Okay, let’s get in the mood, shall we? Turn up some Eye of the Tiger, go change into some sweats, and get the blood pumping with some jumping jacks. Maybe run up a few flights of stairs. You feeling it? Good. Because we’re about to talk about how to get your site back in shape and redeem itself after the Panda update. Speaking today we have Alan Bleiweiss, Micah Fisher-Kirshner, and Mark Munroe. I’m not saying he’s trying to bribe me, but Alan gave me a Dove chocolate bar before sessions started this morning. FULL SIZE!
Onto the presentations…
How do you recover if you were hit by Panda? How do you insulate yourself from Google?
To start us off we have Mark.
He says he had never said the word “panda” before Google’s update. Now it comes out of his mouth several times an hour. Hee.
Feb 24: People looked at their traffic and a lot of it was gone. It became clear what had happened: Panda had hit. It was a site-wide hit. There’s no correlation to page or content; almost everything is pushed down, with normal fluctuations. Panda was only part of the update, and things that went up were still going to go up, which was confusing. He spent a lot of time going through and looking at the types of sites that were hit. There were a lot of commonalities.
- Content Explosion
- Thin Content
- Heavy content aggregation
- Heavy ad/content ratio
- Article & Q&A/ Content Farms
- Dated UI
- Collateral Damage
What was interesting was that there were outliers to all these cases. Certain verticals seemed to be hit harder than others.
Impact of a Site Wide Hit: Content experiments will not yield results. Don’t spend a lot of time updating individual pages and measuring rankings; you might see improvements, but don’t read a lot into individual pages that got hit by Panda.
What is Google Interested In? Think like a Googlelonian (whatever that is). Google cares about its customers and its own pages: the SERP and the SERP experience. To recover, improve the SERP experience. You have to help Google.
Can Google judge page quality?
- Sometimes a short or thin content page is great.
- Sometimes a wonderful looking and designed page will fail because the user’s implicit need is not met.
- Sometimes a page will fail because of aggregate content
- Sometimes a page will succeed with aggregated content
- Sometimes a page will fail because of bad titles
Panda is an admission by Google that they can’t identify the quality of an individual piece of content. If they could, they would and you would not have site-wide panda hits. It would make so much more sense only to demote bad content.
How Google used to ID quality: Reputation. Content was binary – either spam or duplicate. SEOs loved that. Over time, Google changed that. Things are different now.
Google is using behavioral metrics. Just like user behavior on your own site is the best measure of success, the same is true for Google. Those metrics are influenced by the Web pages that show up in the SERP. But they cannot judge an individual page on user metrics because there’s not enough data. Thus, they had to expand the concept site-wide; that way small sites can achieve a critical mass of data over time.
Google is looking at signals that indicate a positive or negative SERP experience.
- A G-bounce within a time window and an alternate selection from the SERP.
- Query behavior after a G-bounce.
- Average time on site before a G-bounce
- Click through rates
- Repeat visits
Where will Google get their data? He thinks:
- They won’t use Google Analytics
- Won’t use tool bar
- Will only use data available on their sites
- Will only use data that is broadly applicable to all sites
- Based on a representative sample of all users
Does it matter if it’s behavioral?
Changing the focus from your site’s usability to the SERP experience will reveal new strategies. It is important as it relates to new SEO strategies going forward. If it is behavioral, it is based on data collection. It takes time to collect data to reach statistical significance. If you get hit, you get less traffic, less data collected. It could potentially take a long time to collect enough new good data to counter your bad history.
Google likes big brands. They were exempt from Panda, for the most part.
Going forward your SEOs better be thinking about user experiences. Your designers and UXEs better be thinking about search. Know Thy Searcher
- Understand the user who is landing on each of your pages.
- All search users have a question. You control which questions get to your site through your content and your titles.
- Start off by making sure you understand the people who land on your pages. Do a survey; find out the top things they want. Make sure you survey people who are representative of your audience.
- Do search usability testing.
- Look for bad keywords
- Look for easily identifiable content. Beware of content that is hidden behind read more buttons.
- Make sure you have good titles and that they’re indicative of the content on the page. Otherwise it will bring in traffic that you can’t deliver on, which will cause a bad experience.
- Review your keywords. Which terms are driving traffic?
- Beware of content for content’s sake.
- Link freely to relevant content. If you can’t give a user what they want, link them off to where they can get it.
- Don’t annoy users with too many ads, opening 10 windows like travel sites do, etc.
- The slower the page load, the quicker someone will bounce.
- Standard bounce rates as reported by most analytics packages are extremely flawed. They only look at a single-page visit. There’s no differentiation between a 3-second bounce and a 90-second bounce.
- Time on site is equally flawed.
- Monitor and improve your SERP clickthrough rate
- It would be great to track the actual back button to capture bounce and time to bounce
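The distinction Mark draws between a 3-second bounce and a 90-second bounce can be made concrete. Here’s a minimal sketch, assuming (hypothetically) that you log pages viewed and a dwell time per session, say via a timing event fired on interaction; the `Session` type and the 10-second cutoff are my own illustration, not anything from the talk:

```python
from dataclasses import dataclass

@dataclass
class Session:
    pages_viewed: int
    dwell_seconds: float  # time between landing and last recorded interaction

def classify_bounce(s: Session, quick_cutoff: float = 10.0) -> str:
    """Separate a quick 'pogo-stick' bounce from an engaged single-page
    visit, which standard analytics packages lump together as one bounce."""
    if s.pages_viewed > 1:
        return "not_a_bounce"
    return "quick_bounce" if s.dwell_seconds < quick_cutoff else "engaged_bounce"
```

The point is that only the quick variety plausibly signals a bad SERP experience; a 90-second single-page visit may mean the page answered the question outright.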
So that was a long-winded way of saying “create a site users love”, eh? Matt Cutts would be so proud.
Alan is next. Danny says he put Alan on the panel because Alan told him he’d be sorry if he didn’t. That remark gets lots of cheers from the audience. He’s obviously a crowd favorite.
Alan starts with a disclaimer: Everything he’s about to present is his opinion based on his situation. Well, yes, I think we figured that.
There are two kinds of SEO:
- Myopic SEO: Focuses only on Google based on magic bullet methods
- Sustainable SEO: Focuses on User Experience as seen through the eyes of search bots and algorithms.
He shows some analytics for some sites that were hit by Panda to show the drop in traffic. People’s lives and careers have been jeopardized by this. Interestingly, the sites hit by Panda showed some signs this was coming through the May Day update.
Alan says when you go about things the right way, you have to be less concerned with Google’s changes.
- Topical Cross Contamination
- Text Anorexia [There’s a controversial term…]
- Internal Link Poisoning
- Unnatural Off Site Patterns
- Sheep to slaughter competitive evaluation
- Provide multiple consistent signals regarding topical focus
- Does this feature confirm or confuse topical focus?
- Does this page overwhelm the senses?
- Off site diversity (inbound links, social mentions, reviews)
He shows a Web page that includes tools, topics, tags, news, categories, tips, forums, etc. It’s an example of topical dilution. They’re trying to do everything, but it makes them about nothing and it’s confusing to a user. He shows a couple more examples of Web pages that represent Myopic SEO in his eyes and then shows how they’re different from pages laid out with Sustainable SEO. Basically, the sustainable SEO pages are a lot more focused and user-centric.
Myopic SEO often has inbound link issues. Too many of their links are brand-focused. That’s great for your brand, but it’s not good for people doing a general search on what you offer. That’s tunnel vision SEO.
Sustainable SEO Benefits
- Higher ranking in Google
- Google roller coaster resilience
- Higher rankings in Bing
- More conversions
If you do Sustainable SEO, you’ll not only be safe in Google, you’ll also have a much broader reach in Bing/Yahoo, YouTube, or more niche market search engines.
Bing Inbound Link Treatment
- Diverse relevant inbound link anchor text
- Get links on Bing-ranked sites
- Laser targeted anchors to relevant page
Bing has more difficulty discovering content. Submit Sitemap.xml to Bing Webmaster Tools
Bing Facebook Deal – don’t just push content, engage for authority
Sustainable SEO anticipates what users are doing today and what they’ll be up to next.
Next up is Micah.
The Event: Panda Attacks
Monitoring system goes haywire, traffic dropped 20 percent. It’s not a good day.
What do you do?
Is the data fully in? It takes time for that. It means you need to maintain a good relationship with your ops team. If they’re telling you to file a bug, that’s not a good sign. Massive events require flexibility within the organization. Utilize Google Analytics hourly reports with advanced segments.
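Micah’s advice to watch hourly reports boils down to a simple week-over-week comparison. A minimal sketch, assuming you’ve exported hourly visit counts (the function name, data shape, and 20 percent threshold are all my own illustration):

```python
def flag_traffic_drops(this_week, last_week, threshold=0.20):
    """Compare each hour's visits to the same hour a week earlier and
    return the hour indexes where traffic fell by more than `threshold`."""
    flagged = []
    for hour, (now, then) in enumerate(zip(this_week, last_week)):
        if then > 0 and (then - now) / then > threshold:
            flagged.append(hour)
    return flagged
```

Comparing against the same hour last week, rather than the previous hour, avoids false alarms from normal daily traffic cycles.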
Who else is affected? Find the limits of that event. SEO affects everything and everything affects SEO. Walking over to a person and being able to say, “hey what’s going on” helps you get an indication of what’s happening. This is NOT the time to send an email. Get up and speak to them so you can find out what’s happening right now.
Algorithm update rumors? Read, read, read! Read what everyone is saying. Focus on forums and search news sites. Remember, just because you find one issue does NOT mean it’s the only problem.
What was recently launched? Keep an event log. Sometimes a product launch from a month back can be the cause. Bug the engineers. Not every detail is always written down. Watch for rollbacks that undo critical changes.
What areas are affected on your site? Segment in any way you can. Keyword groupings, keyword length, keyword traffic level, motive, page groupings, domain/site groupings, etc. Get to love playing with your data.
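That segmentation advice is easy to sketch in code. Assuming (hypothetically) you have per-keyword rows with before/after visit counts, this groups them by any segment key you like — keyword group, keyword length, page grouping — and shows where the drop is concentrated; the function and field names are my own illustration:

```python
from collections import defaultdict

def segment_change(rows, key):
    """Aggregate before/after visits by an arbitrary segment key and
    return the fractional traffic change per segment."""
    before = defaultdict(int)
    after = defaultdict(int)
    for row in rows:
        before[key(row)] += row["visits_before"]
        after[key(row)] += row["visits_after"]
    return {seg: (after[seg] - before[seg]) / before[seg]
            for seg in before if before[seg]}
```

A result like Q&A pages down 75 percent while brand pages are flat tells you far more than a single site-wide number.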
Did anything break? Let’s assume everything you’re doing is white hat, right? Sometimes broken or forgotten processes can lead to a broken website that looks like it’s black hat SEO. Know what is fundamental to your site’s SEO. Backend functions are the easiest to miss. Worker transitions always miss certain processes. Go back to the ops team to run through the SEO checklist.
Go back to the algorithm update rumor. We find confirmation that something happened. Now what? Data collection time.
Who’s talking? Recognize the regulars. Skip those with a broken track record. Always read the important people, even if it’s just one sentence. Scrutinize the strangers. Read the long commenters; there’s usually something to glean. Short comments are usually junk. Jerks will be jerks. Push past their annoyances and listen.
What sites are dropping? Your ranking data shows the severity of the impact. Competitive ranking data is essential. Seeing who survives helps provide answers about algo updates. Searchmetrics, Sistrix, seoClarity and SEOmoz are just the beginning.
What are the theories? Be a white hat SEO, think like a black hat. Don’t actually be one, just think about the possible impacts.
What theories fit? Find out everything you can. Work your business connections; read blogs for “in depth” analysis. Jot down likely possibilities. Make sure you have enough data. Get out in front fast, before others do.
What can you do to recover? Understand how changes were made. Do A/B testing. How do you test when the algorithm works site-wide instead of at the page level? That’s the twist: our industry has been testing at the page level.
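Whatever level you test at, you still need to know whether an observed difference is real. A minimal sketch of a standard two-proportion z-statistic, which could compare, say, SERP click-through rates between a changed section of the site and an untouched control section (the scenario and function name are my own illustration, not from the talk):

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-statistic: does group A's rate (e.g., clicks
    out of impressions) differ from group B's under a pooled estimate?"""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se
```

A |z| above roughly 1.96 corresponds to significance at the usual 5 percent level; with a Panda-sized traffic drop, reaching that threshold takes longer because less data is coming in.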
And we’re out of it. I’m gonna get lunch, you go save your Web sites.
About the Author
Lisa Barone co-founded Outspoken Media in 2009 and served as Chief Branding Officer until April 2012.