Ask the SEOs

March 10, 2011
By Lisa Barone in Internet Marketing Conferences

We’re down to the last session!  If you’re still with us, I want to say thank you. I know it’s been a virtual fire hose the past few days, but we’re not done yet so hang in there!  Up next is the coveted Ask the SEOs session with some of this industry’s greatest and most decorated veterans. Speaking we have Greg Boser, Bruce Clay, Vanessa Fox, Todd Friesen, Stephan Spencer, and Jill Whalen.  You don’t need any banter from me. Let’s just do it!

We have some long introductions for everyone. Bruce has been in SEO for 187 years and has offices on 5 continents. Vanessa is the voice of Google even though she's not from Google. Todd Friesen works for Performics and has offices in more countries than Bruce. Greg is President of Products & Services at BlueGlass and has been around as long as Bruce, even though he's not as old as Bruce. Jill has one office and it's in her house; she's been doing this as long as or longer than anyone. Stephan Spencer is the man most likely to introduce you to a loophole you'd like to use.

Feel introduced? Let’s begin.

What happened with the Panda update? What signals are good or bad?

Vanessa: I spent 25 hours writing an article on The Farmer/Panda Update – are you kidding me? Danny says she needs to give it to us in a tweet. Hee!

Greg: It punished people for monetizing in an ad network that wasn't theirs. ZING! They're making site-wide judgments about overall quality. If you have a large volume of shallow content that probably doesn't have a lot of corroborating [new game: each time Greg says 'corroborating,' take a drink! Trust me.] sources like external link support, you're probably not going to rank as well anymore, even for longer-tail stuff. They've decided now that people who make content are spammers. [No, people who write SHIT content are spammers!]

Vanessa: The key there is great content.  [Thanks, Vanessa!]

Jill: She reviewed a small handful of sites that were doing well before the Farmer/Panda thing. One thing she noticed is that a lot of the sites seemed to have lots of content behind a tab, and in Google's text cache, all of that content was being indexed. A lot of the sites also gave garbage information that led you off to ads or other sites. It's content that seemed relevant on the surface, but really isn't.

Bruce: There’s probably a lot of signals here and it’s possible to violate some of them and not violate all of them. If you have a site with 10k pages and every page has 500 words of content, there’s a good chance it wasn’t really written for an audience.  There was a release a month or so ago where they mentioned that if you go to Advanced Search you could see the complexity of the pages. When you did a search it would tell you whether it was high, low or medium. He thinks that may have been some of the factors.  They’ve been looking at Kincaid writing styles and they found that the pages that appear to be getting penalized are the sites that don’t match the top-ranked sites anyhow. If the top ranked sites had a certain type of characteristics of the writing, the sites penalized didn’t come close. He thinks it was a measurable style in the factors that would influence it.

Vanessa: Just create high-quality content because the algorithms are going to change over time to find high quality content. [Isn’t she so cute you can’t even stand it!]

Stephan: Google is looking for corroborating signals [DRINK!]. If it’s high-quality content it’s going to get tweeted, stumbled, etc.

Greg: He's all about footprint modeling. Look at all the things that the good/bad sites share. You can paint an accurate algorithmic model of what a human reviewer judged to be poor quality. That's the best approach a search engine can take. That means you have to pay attention to whether or not your site falls into that footprint.
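
To make "footprint modeling" concrete, here's a hedged sketch: train a simple classifier on site-level features against human quality labels, then score your own site against the learned footprint. The features and numbers are invented for the example; this isn't Google's feature set or Greg's actual tooling.

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical site-level features:
# [avg words/page, ads above the fold, citing links/page, pages published/day]
sites = [
    [420, 3, 0.1, 200],   # thin, ad-heavy, fast-publishing
    [1800, 1, 4.2, 2],    # deep content, well cited
    [500, 4, 0.3, 150],
    [2500, 0, 6.0, 1],
]
human_labels = [0, 1, 0, 1]  # 0 = reviewer rated poor, 1 = rated good

model = LogisticRegression().fit(sites, human_labels)

# Does this footprint look more like the good pile or the bad pile?
print(model.predict_proba([[600, 2, 0.5, 90]]))
```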

Todd: It’s interesting Google is modeling things off what people DON’T like instead of what they DO like.

Vanessa: Well, they are. That's what all the other signals are about. One other thing all the sites had in common was that they had ads above the fold. Maybe that's not a good experience for users. Maybe they're worrying about that.

Danny: Google was really specific that they didn't use the Chrome block data to make the algorithm; they made the algorithm and then compared it to what people were blocking, and they had 80 percent coverage. Now they have a feature where people can block stuff without the extension. Danny thinks they will look at that.

Are there trends that you see becoming important in the future that aren’t currently a common SEO practice?

Greg: I think the trend of how you launch and deploy new content is going to have to change. The idea that "more is better" is not true. There are only so many hours in the day to crawl the Web. If you think about how much time Google has to waste crawling crap to find the very small sliver of content it will actually show its users, it's not very efficient. When you start a new site, you don't unleash 50,000 pages; that's not going to be a good site. Release content slowly as you develop the site over time. Take that slower, more metered approach.

Stephan: He thinks internal linking structures will become more fluid.  You’ll be able to have more dynamic intelligence driving what the links are on these various levels of your Web site and correlate that with your rankings and your desired rankings for various terms. You’ll come up with a keyword URL map to find the top keywords you want to go after.
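
As a toy illustration of the keyword-URL map idea, the core data structure can be as simple as a lookup table that drives which internal links a template renders. The keywords and URLs here are invented; this isn't Stephan's actual tooling.

```python
# Hypothetical keyword -> target URL map driving internal linking.
keyword_url_map = {
    "blue widgets": "/widgets/blue/",
    "widget repair": "/services/repair/",
    "buy widgets online": "/store/",
}

def internal_links_for(page_text):
    """Return (anchor, url) pairs for mapped keywords this page mentions."""
    text = page_text.lower()
    return [(kw, url) for kw, url in keyword_url_map.items() if kw in text]

for anchor, url in internal_links_for("Our guide to blue widgets and widget repair."):
    print(f'<a href="{url}">{anchor}</a>')
```

Swap the map's contents as ranking data comes in, and the site's internal links update with it, which is the "fluid" part Stephan is describing.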

Bruce: We’re going to see a shift, industry-wide, that we need a better understanding of the intent of every query, behavioral search, personas, how people do consecutive searches, communities, location of the searcher, that the intent of the query is based on where you’re at. We’re going to have to understand devices.  Where you’re at and the device you’re using is going to be important. The site that best satisfies the searcher is the site that “matches”. It’s all going to be keyword community-driven.

Jill: What Bruce is saying means you can’t rely on tricks like you used to. It’s really about doing the basic things right. It’s less about the “newest” things that Google is going to do. You have to meet the search query.  Have a purpose for every page of your site.

Todd: SEO is slowing down and becoming a lot more thoughtful than it's been in the past. You can't just release a million pages at once. You can't get links with one specific anchor text. The Farmer/Panda thing is based on modeling. [Vanessa starts squirming so Todd stops.]

Vanessa: If what you’re doing is looking at pages and anchor text/link distribution than that means maybe you’re not doing things for the right reasons.

Greg: That’s silly, Vanessa. That’s our job.  Of course we’re looking at data. The days of “creating Google content” and waiting for the Google leprechaun to come and find everything for you is over.

Vanessa: I don’t think modeling yourself after other sites and their anchor text/link profile is going to help you. You need to think about who your customers are and what they want. You don’t need to build a certain anchor text pattern. That’s not what the search engines are looking for.

Stephan: You need to watch that stuff when you come in on an account trying to figure out what went wrong.  [Vanessa agrees]

Todd: You don’t want to be the sore thumb sticking out.

Greg: That is the approach we take. He dissects the space to spot trends and see where clients are exceeding obvious thresholds. He can come in and say you don't need any more links that say "blue fuzzy widgets" because it's going to get you filtered and make sure you don't rank. Knowing the makeup of an individual space based on a link map is important. You learn a lot that way. [Greg calls Vanessa a wet blanket! Hee!]
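
To make the threshold idea concrete, here's a hedged sketch of the kind of anchor-text audit Greg describes: tally the anchors in a backlink export and flag any exact phrase taking up a suspicious share. The 20 percent cutoff is an arbitrary stand-in, not a number anyone on the panel gave.

```python
from collections import Counter

# Anchor texts pulled from a hypothetical backlink export.
anchors = (
    ["blue fuzzy widgets"] * 61
    + ["Acme Widgets"] * 20
    + ["acmewidgets.com"] * 12
    + ["click here"] * 7
)

counts = Counter(anchors)
total = sum(counts.values())

THRESHOLD = 0.20  # arbitrary example cutoff, not a known Google limit
for anchor, n in counts.most_common():
    share = n / total
    flag = "  <-- looks unnatural" if share > THRESHOLD else ""
    print(f"{anchor!r}: {share:.0%}{flag}")
```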

Forget Farmer, how can I overcome a negative 50 penalty?

Greg: A minus 50 is when you all of a sudden go from wherever you were to exactly 50 spots lower than you were before. You usually have to do bad things; it's definitely a manual kind of thing. What's kind of sad/funny is that when they do it, you'll search for your name and you'll be at 51, but you'll still have your sitelinks. It usually requires saying you're sorry and cleaning up. And a reconsideration request.

Todd: The cleanup is probably going to be a lot more than one thing. To wind up with a true penalty, it's probably not because you were in one particular link network. You need a rap sheet at Google. Google's keeping track of what you've been doing on and off the site, and there are flags all the way along. Google can pop up red flags all over, and no one flag in particular is going to get you a penalty.

Greg: It’s the repeat offender penalty. There’s usually a bad track record of you and your site. The people he’s seen it happen to have a pattern over the years of doing bad things.

Stephan: You need to be comprehensive in your cleanup and your mea culpa. Blame it on a rogue SEO. ;) You can't still hold your cards close to your chest.

Greg: The worst thing you can do in a reconsideration is NOT be truthful.

Jill: If you have a penalty that bad, you know what you did. It’s not an accident.

Bruce: In the last month and a half he's had two Fortune 100 companies come to him with a negative 50 penalty. Both times they found out one small division of the company had decided to buy links. It was all under the main domain, so that little guy buying the links knocked down the keyword for the overall domain. It took 3-4 weeks to make sure all the links were pulled, and now they're back up to the front page. Google will knock you down. The hard part is the research. In larger companies you have to be a little bit more careful.

Stephan: Sometimes it helps to talk to a Google engineer at a conference like this. But do bear in mind who you're talking to. Someone on the Kirkland, Washington team doesn't have the same access as the Web Spam team.

Vanessa: It’s super unlikely that you can even talk to a spam engineer and they lift a penalty from your site. You still will have to go through the reconsideration request form. When you file it, it will still go to the spam team just like it would any other time. It’s always them who handles those requests.

Many of our clients are on IIS and .NET, where a 404 error is not possible. If you can't get a page to generate a 404 error, what should you do?

Greg: Get off the Web.

Vanessa: There is a way to change that.  You probably need a better developer. It’s not easy, but you can do it.
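
For context, the underlying problem here is the "soft 404": the server returns a 200 status with a not-found page, so search engines keep indexing the error page. The fix on any stack, IIS/.NET included, is to make the not-found path return a real 404 status. Here's a minimal sketch of the idea in Python (not the actual IIS configuration, which lives in web.config error-handling settings):

```python
from wsgiref.simple_server import make_server

KNOWN_PATHS = {"/", "/about"}  # stand-in for a real routing table

def page_exists(path):
    return path in KNOWN_PATHS

def app(environ, start_response):
    # The "soft 404" anti-pattern returns 200 OK with a not-found page.
    # Send the real status code so crawlers drop the dead URL.
    if page_exists(environ["PATH_INFO"]):
        start_response("200 OK", [("Content-Type", "text/html")])
        return [b"<h1>Real page</h1>"]
    start_response("404 Not Found", [("Content-Type", "text/html")])
    return [b"<h1>Page not found</h1>"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```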

If a site is penalized, is it okay to just change URLs and redirect?

Greg: Some penalties do transfer via redirect and some don't. He used to do a lot of "testing" to see if it was a penalty by doing redirection to see how a new site responds. You never want to file a reconsideration request unless you're sure you've been penalized, because every site has some skeletons; that's a last resort. In general, trying to redirect your way out of a penalty is not a good idea.

Are you doing anything to accommodate Bing now that they have 30 percent?

Greg: There’s a gap in where they are/their ability to do the same kind of stuff that Google does. The kind of brute force style search engine optimization that used to work in Google now causes problems, but it makes you rock in Bing.

Stephan: If you allow penalty-type material on your site because it’s ranking in Bing, you’re shooting yourself in the foot.

Vanessa: Those aren’t the types of things she’d recommend people look at. From a technical perspective, the things that are obvious is that Bing can’t crawl Javascript, the canonical attribute isn’t used as much, etc. If you try and think that you have all this stuff on your site from a technical perspective, you still are probably going to lose out in Bing.

Bruce: Since Google has over 90 percent share in the non-US markets, clearly you're going to target Google if you have any kind of international presence. Bing eventually will copy all of Google one way or another. He's more concerned about what's going to happen with Facebook or YouTube; if he were going to worry about how people will rank or be found, he'd look there. He's more concerned about local search than about Bing.

Greg: And then there’s the dual site strategy of having a different site for each engine. It’s not a long-term strategy.

Final Top Tips

Stephan: You don't show up without really powerful SEO technology. The tools are only as good as the operator, but you're also only as good as your tools. Say you have a tool for monitoring audit scores and you're using tools to measure linkbaiting/social media/etc.; if you don't have industrial-strength tools to help you do an effective job, you're going to have a hard time. Use technology to make it easier to do things across large-scale sites compared to people working through them by hand.

Jill: Don’t believe anything that anyone tells you. Test it for yourself. Every site is different and every site needs different things.  [Todd feels like Jill just said not to come to search conferences. Hee.]

Greg: Start spending time looking at corroborating signals to find out what's going to be important in the future. You can't ignore the human engagement part: the opinions that are formed about your site.

Todd: Pass the information between SEO and PPC. Don't bury them in different departments. You can find terms getting massive ROI that you're not even chasing.

Vanessa: For a long time a lot of SEO has been about links, and we're starting to see a shift now toward user experience. There's a lot of data that's already available (Google Webmaster reports, your analytics, etc.), so use it. If you get a lot of traffic but you have huge bounce rates, something's going on. If you rank well but aren't getting conversions, something's going on. Look at those types of things.
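
As a toy version of the checks Vanessa is describing, cross-reference traffic against engagement and flag the mismatches. The pages, numbers, and the 70 percent bounce cutoff are invented for illustration:

```python
# (page, visits, bounce_rate, conversions) from a hypothetical analytics export
pages = [
    ("/guide/widgets", 12000, 0.86, 3),
    ("/store", 4000, 0.31, 210),
    ("/blog/announcement", 9000, 0.74, 1),
]

for path, visits, bounce, conversions in pages:
    if visits > 5000 and bounce > 0.70:
        print(f"{path}: lots of traffic but {bounce:.0%} bounce -- something's going on")
    elif conversions / visits < 0.001:
        print(f"{path}: gets traffic but barely converts -- something's going on")
```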

Bruce: The search results page is changing a lot. There are more items contending for that space; we've found first-page Google results with as few as four organic links on them. In 2 years he thinks 70-80 percent of queries will have a local result on the page. We're going to have to learn to cope with local. What used to be on-page is going to be on-site. Hug your PPC counterpart, but buy beer for your analytics team.

Danny: For good SEO, get good at Twitter and Facebook, because that's the good (easy) link building. Social signals are important.

And we are OUT of here! I hope you found both the SMX and Pubcon coverage useful this week. We’ll see you on Monday.
