Goood morning, everyone! How was your night? Are you ready for some more PubCon craziness? I hope so because, OMG, YOU’RE ABOUT TO GET SOME!  I fully expect this to be a totally awesome talk.  And apparently so does everyone else, because all the tables were scooped up way before the talk started.  It was weird – conference attendees in Vegas actually attending the sessions. I can’t even wrap my mind around that yet.

All the big shots are here. Danny Sullivan is in the room. Derrick Wheeler is here.  Eric Enge.   It’s pretty cool to see so much life this early.

Brett asks everyone to turn to the person next to them and share their worst travel war story.  I suddenly begin staring intently at my laptop, hoping no one notices. Strangers, dude, strangers.

Luckily, it’s now time to start. Up on stage we have Google’s Matt Cutts and Amit Singhal. Up first is Matt. He has slides.

Someone asked Matt what his favorite PubCon tweet from yesterday was. He thought about it. Turns out it was when someone tweeted about Leo Laporte’s statement that search engines would be irrelevant in six months [read our coverage of that panel].  He read that tweet while alone in his hotel room, so he wants to show us his reaction.

To do so, he takes a sip of water and then SPITS IT OUT all over the stage.  Awesome.

He asks Brett if he knew this – that search was going to shut down in six months.  Is Brett going to cancel PubCon Hawaii? Heh.

SEOs will be out of a job in six months? If you took all the times people said SEO was dead, you’d have a zombie character. That’s everyone’s favorite meme.  SEO is not dead because SEO is a type of marketing. And marketing appeals to human nature, and that’s never going to go away. There’s really useful stuff that we provide. He thinks of SEO as coaching, not marketing. There are people who have the equivalent of an online resume. An SEO makes sure they put their best foot forward. There’s nothing wrong with that. There are tons of white hat SEOs and a very small number of black hats – but for the most part, you’re a coach. You’re teaching someone how to interview better.  There will always be a role for people who want to present themselves better. If you look back a few years ago, SEO may have been more mechanical.  Do you have the right keywords, do you have the right titles.  Now it’s more about appealing to human nature.

Search is about change.  Things are more complicated now. These days, SEO and search is a different type of challenge than it was in 2001.  Now we have mobile. We have voice. We have maps. In SEO, the way that you present yourself, the only constant is change. And the very best SEOs understand that. You don’t want to go where the search engines are. You want to go where the search engines are going to be. Search engines want to return the best possible experience. As long as that’s your MO, as well, you don’t need to worry about it.

SEO in 2010: Hacked Sites

Back in 2010, a few people were asking, “Where is the Web spam team?”  He wants to let us know what was going on: hacked sites. There was this whole side battle going on. To make sure that hacked sites didn’t overwhelm the rankings, they had to pull the team over to work on it. So that’s where they were and what they were doing.

SEO in 2011: Low Quality Sites (Panda) and Communication

There’s not a ton to say about Panda [Really? Because that seems to be all SEOs can talk about ;)]. It was a change initiated in the search quality group. It’s an algorithmic change. When they evaluate a change, they want to make sure it makes sense most of the time. So if you feel like you were unfairly affected, they want to know about that so they can make improvements.

At this point, they’re giving more heads up if site owners are violating their Webmaster guidelines. You can go to the Webmaster console and get a very good idea if action was taken against your site, if they think your site has gotten better, if your site has no manual action at all, or if they still think you’re in violation of their guidelines. That didn’t exist last year. That’s radically better communication.

Long-term SEO Trend

10,000 foot view

  • Mobile: What is a cell phone? A cell phone is a computer that you carry with you everywhere. You need to be thinking about how your Web site is affected by mobile searches and how you present yourself. They’re selling $35 tablets in India now.  That’s a really big trend.
  • Social: There’s been a LOT of discussion about social here. If you’re working on Twitter or Facebook, that’s essentially a private Web forum. They can only use signals they can see on the open Web. Longer-term they’re definitely thinking about social. Because if they can move from an anonymous Web to a Web where reputation matters, that’s going to bring more accountability and make the Web better. You don’t necessarily have to optimize social for search engines.
  • Local: That’s where the vast majority of concrete purchases take place. You want a strategy for each one of them.

1,000 foot view

  • Better Page Understanding: In the old days, Googlebot was kind of an idiot browser. It’s getting smarter. It can handle dynamic content: they can crawl the pages, process the JavaScript, and interpret it.  They can understand what’s on the page and what actually matters. They’re looking at algos that try to figure out how much content is above the fold. If you have so much stuff (ads) obscuring content above the fold, you may want to think about that. What does the layout of your site look like when someone lands on your page? Do people see content or something else that’s distracting?
  • More Personalized Search: If you start to open up things like Android voice actions, if somehow that doesn’t work, that falls through to Google. And the same thing happens with Siri.  The trend is absolutely toward people sending more personal searches to Google. They’re not personally identifiable, but they’re more personal.
  • Better Tools for Searchers: On Monday they released Google+ Pages for Businesses.  Only a few people in the room use the + operator, and we’re the most savvy searchers in the world. He wouldn’t be surprised if they offered a literal mode – turning off stemming, synonyms, etc.
  • Communications and Transparency: He asks how many people use WordPress. Lots of hands. He asks how many people have been hacked. Lots of hands. Heh. They now give email alerts to tell you if you’ve been hacked. They want to do that for more platforms. They want to turn as much of their decision making inside out as they can. They’re thinking of all kinds of Radical Transparency.
  • Sending info to Google: With Panda, a few people were worried about scraper sites. They don’t want scraper sites to ever outrank the original content. What if when you publish your post you could send that ping to Google to tell them you just published it and Google would then know where it came from. They’re exploring that. It’s still early days.


1 foot view

  1. Sign up for Webmaster tools
  2. Sign up for email alerts
  3. Set up “fat pings” when you publish: pubsubhubbub.appspot.com
  4. Subscribe to: Webmaster blog, Inside Search Blog, Webmaster Video Channel
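The “fat pings” item above points at PubSubHubbub. In that protocol, a publisher tells a hub about new content with a simple form-encoded POST (`hub.mode=publish` plus the feed URL). Here is a minimal sketch in Python; the hub URL matches the one Matt mentions, but treat the rest as illustrative, not an official client:

```python
import urllib.parse
import urllib.request

HUB = "https://pubsubhubbub.appspot.com/"

def build_publish_ping(feed_url):
    """Build the form-encoded body of a PubSubHubbub 'publish' ping."""
    return urllib.parse.urlencode({
        "hub.mode": "publish",
        "hub.url": feed_url,
    }).encode("utf-8")

def notify_hub(feed_url, hub=HUB):
    """POST the ping to the hub; per the spec, a 204 response means success."""
    req = urllib.request.Request(hub, data=build_publish_ping(feed_url))
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

You would call `notify_hub("http://example.com/feed")` right after publishing, so subscribers (potentially including search engines) hear about the new content before any scraper does.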

Matt doesn’t think SEO is going to die.  There will always be change and the best SEOs will adapt for that.

And with that, Matt brings up his boss aka Amit. It’s time for Q&A.

Amit: It’s been fun working with Matt for the past 11 years. In the early years, they’d build a monthly index and the March index would always fail. A new index would come out and they’d have some bugs in their PageRank calculation.  His boss would tell him to change this ranking algo that worked with the bug and he’d give it to Matt and say, “dude, find those scraper sites and take care of them”. That started 11 years back. He says he’s a better person because he knows Matt.

AND THEN THEY HUGGED!  Dude! Someone get tissues. It’s a Lifetime movie in the making.

[I missed the question. Something about not providing enough data for enough queries. He wants more data]

Matt: You can get the top 1000 queries that allowed you to get traffic from Google.  Over 96 percent of all sites get all of their queries from those thousand. What if you could get 5,000 queries a day? It turns out the amount of storage space would be 2-3x larger. The Yahoos of the world or the Facebooks of the world have a TON of different queries that drive traffic. They’re trying to make more data available.

Guy was hit by Panda. They dropped significantly for the terms that drive the most traffic. He noticed that sites with lower quality have ended up on the first page.  Why?

Amit: We have been focusing quite a bit on high quality sites this year. Their preference is always to do things algorithmically because algos scale. They scale across languages, they scale across verticals, etc. Panda has been a huge algo change that favors high quality sites. We have tested it thoroughly and it has been a positive change across all known measurements. All of our measurements say we have done a great thing for users and for the Web ecosystem. Google is built on a healthy Web. If the Web ecosystem remains healthy, Google will remain healthy.  No algo is perfect.

Matt: One misconception is that Google isn’t listening or that we don’t hear what people are saying. We actually have a spreadsheet taken from a Webmaster forum with 500 sites that they’re looking at to see if they made mistakes. But, in general, they think it’s working well. There’s a guy on his team whose entire job is to identify false positives. It’s still the case that it’s under active development. They still have a lot of people working on it. They want to get things right. It will just take time.

When we search for home appliances, we get Sears, Costco, etc.  There are a lot of guys who have niche sites for specific products. What is the reason we only get the big brands? You’re trying to make the algo so perfect, but do you think it’s losing relevance?

Matt: The Web is one of the only places where if you’re a small business you can move faster than the big guys. You can be the mammal, instead of the dinosaur.  Now, the big guys are sometimes big for a reason – they have real-world reputation.  We want to actively think about small business and how we can help them. Webmaster tools is how we help them. Big brands will do dumb things in Flash, where the small guys are smart enough to do text. In his experience, it is the case where sometimes what’s on the Web reflects what is offline.  If you concentrate on a very specific niche, you can outrank the big guys. You can concentrate and double down.

Amit: As we have realized in the last year, returning high quality sites is what users really, really like. We have seen all of our metrics go up and to the right as we have released all these changes. Clearly there is something about these big brands. We don’t do things at the cost of relevance. The definition of improving search quality is to give more relevant, higher quality results. They don’t give much attention to what TYPE of site gets ranked [coughCOUGHcough]. Our job is to optimize for a large set of queries and measure it as scientifically as possible.

During one of the sessions yesterday, the quote was “Google doesn’t care about quality”.  [He’s talking about the local search panel we covered yesterday where someone said Google doesn’t care about Google Place review quality]. He gets robo calls all the time. He knows Google cares about quality.  Can there be a better process for us to help Google improve the quality of the content?

Matt: Amit is a voice of purity in the algorithm.   They have spam fighters who can take action in 40 different languages. The Web was the wild west for a while. It’s not quite the same way anymore.  There’s still a little element of the wild west in local.  His group helps the Maps team; they don’t directly handle spam on Maps, but they share best practices. But that area is changing very fast. He thinks they will make a lot of progress.  They’re open to outside feedback.

SSL Search – we lose our keyword data. On the other hand, user privacy is very important. 

Matt:  Search is becoming more personal. And he expects search will grow more personal in the coming months. They’re trying to get ahead of that.  Some people are unhappy about losing their keyword data. 96 percent of sites get all of their queries if they download their data from Webmaster tools. Another thing people get mad about is that advertisers get the data but site owners do not.

If you’re an advertiser and you’re advertising for a Hilton, you don’t want to pay for Paris Hilton. If you don’t see the query was “paris hilton,” you don’t know to do a negative match and you’re paying for bad traffic. If people are paying for something that doesn’t perform, that’s really bad for user experience. If people are paying for it, they need to see what converts and what doesn’t.  The pragmatic reason is, if you’re an advertiser and we tell you the referrer won’t have the query – you’re going to be unhappy and you’re going to write a blog post. AFTER that, if you’re one of those 96 percent of sites, you’re going to download the top 1,000 queries that you get your traffic from and you’re going to write exact match ads.  By doing that, you’ll be able to reverse-engineer the queries and figure out what they were anyway. They’ll keep listening to feedback, but he doesn’t expect them to back off SSL.  If anything, they might move forward to where advertisers don’t get data.
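The trick Matt describes – downloading your top queries and turning each one into an exact match ad – is easy to sketch. A minimal version in Python, assuming a CSV export with a `Query` column (the actual Webmaster Tools export format may name its columns differently):

```python
import csv
import io

def exact_match_keywords(csv_text, limit=1000):
    """Turn a query export (assumed 'Query' column) into a list of
    AdWords-style exact-match keywords, i.e. [keyword] syntax."""
    reader = csv.DictReader(io.StringIO(csv_text))
    keywords = []
    for row in reader:
        keywords.append("[{}]".format(row["Query"].strip().lower()))
        if len(keywords) >= limit:
            break
    return keywords

sample = "Query,Impressions\nparis hilton,1200\nhilton hotels,800\n"
print(exact_match_keywords(sample))  # → ['[paris hilton]', '[hilton hotels]']
```

Ads written against those exact-match keywords would then report per-query performance, which is how an advertiser could recover most of the data the SSL referrer no longer carries.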

Is there a way we can get back the SSL data? Because we think it’s essential for SEOs.

Matt: No. There are no plans in the future. We do believe in privacy. If users are logged into Google, they’re more likely to be doing a search that’s personal, and they don’t want that to show up in the referrer.

Recently there’s been talk about press releases being flagged as duplicate content. If you have a release you want to syndicate out, what are some things you can do to make sure your release doesn’t get flagged?

Matt: It’s much better if people are wanting to write about you, than you trying to get them to write about you. In his experience, the links coming from press release are only good when you’ve convinced someone to write about you who WANTS to write about you. He doesn’t think there’s much value from press release links.

We can’t go back in time and get all that query data out now, so is Google going to give us some sense of whether it’s going to extend the data it provides?  Would the audience rather go back in time to get the old data, or get data for the remaining 4 percent?

Matt: We can either give more data to Webmasters (the top 2,000 queries every day) that will help those 4 percent of sites OR instead of seeing the data for the last 30 days we can go back 60 days. They’re pretty agnostic on this. They’re happy to go one way or another.  Matt asks the audience: Would you rather get MORE search queries? – 40 percent.  Would you rather be able to go further back in time? – 60 percent

In a tweetable format, what does Google think is a high quality site?

Amit: If your child can learn something from a site, that’s a high quality site.

And we’re done.  Woah.


About the Author

Lisa Barone

Lisa Barone co-founded Outspoken Media in 2009 and served as Chief Branding Officer until April 2012.


9 thoughts on “Hot Google Topics & Trends – Matt Cutts & Amit Singhal”


  • MikeTek said:

    “If you’re an advertiser and you’re advertising for a Hilton, you don’t want to pay for Paris Hilton. If you don’t see the query was ‘paris hilton’ you don’t know to do a negative match and you’re paying for bad traffic.”

    …except Google doesn’t actually show you that in Google Analytics – you have to hack it. They just show you the keyword that triggered the visit. In other words, you’ll still just see “Hilton” in your reports.

    “If users are logged into Google, they’re more likely to be doing a search that’s personal and they don’t want that to show up in the referrer.”

    Breathtaking logic there, Matt. Thanks!


  • Dinesh said:

    Awesome post, Lisa. While I was reading it, I felt like I was attending the conference live. I really liked the 10,000 foot and 1,000 foot views of the SEO trends.

    Thanks,


  • Ryan Bradley said:

    Sending info to Google can be a good solution to scraper sites but then you would have to do that for all search engines (I don’t believe in a Google only mentality). I think it would be better if the search engines can check the server upload time when content is added, possibly via xml sitemap.


  • David said:

    Nice summary on the latest search expertise from Google.

    As Matt says, I think local search still has a lot of growing to do over the next couple of years.


  • Steven Ferrino said:

    Awesome recap Lisa.

    “Brett asks everyone to turn to the person next to them and share their worst travel war story. I suddenly begin staring intently at my laptop, hoping no one notices. Strangers, dude, strangers.”

    I was sitting directly in front of you and debating turning around.

