SEO Ranking Factors In 2009

…and we’re back from lunch. If you haven’t chimed in on the “is nofollow dead” drama going on, get to it. We’ll wait.

Okay, actually we won’t wait. But you can read it later. Danny Sullivan is moderating the likes of Rand Fishkin, Laura Lippay and Marty Weintraub. This should be all kinds of interesting. I almost feel bad for Laura. She’s paired with some crazed, loudmouth boys. Kidding! :)

Oh, Danny Sullivan is trying to take over my laptop. He has a message for you…

[Lisa is the most awesome live blogger. I have taken over her computer to say this. But she told me I had to say it. No, not really. It's true. But now I have to go on stage to run the session. Bye, --danny]

Thanks, Danny!  Can we get to the liveblogging now? Stellar!

Rand Fishkin is up first.  He lets us know about the ranking study SEOmoz recently conducted where they matched real data with expert opinion on tons of different ranking factors.

One thing they looked at was which on-page features matter most. According to the data, the most important on-page factors are URL, title, and alt text. An image with alt text on the page seems to have a high correlation with high rankings. Correlation is NOT causation. Just because you have great alt text doesn’t mean that’s why you’re ranking; however, many people who rank very well TEND TO have good alt text. I think that’s an important clarification by Rand. Nicely done, sir.
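To make that distinction concrete, here’s a minimal sketch in plain Python, with made-up data: it measures how strongly a feature like alt text correlates with ranking position, which is exactly what a study like SEOmoz’s can show, and exactly what it can’t explain.

```python
# Hypothetical illustration of "correlation is not causation."
# The SERP data below is invented for the example.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

ranks   = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]   # position in the SERP, 1 is best
has_alt = [1, 1, 1, 0, 1, 0, 1, 0, 0, 0]    # 1 = page uses alt text, 0 = it doesn't

# Negate rank so that "ranks better" means "larger value".
r = pearson(has_alt, [-rank for rank in ranks])
print(f"correlation of alt text with ranking well: {r:.2f}")  # ~0.66 here

# A positive r says pages that rank well TEND TO have alt text.
# It does not say the alt text is why they rank.
```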

Importance of keyword usage in various parts of the title tag: Should you put your brand first? They do on SEOmoz, but if you’re basing the decision solely on SEO, the research says it’s not the best way to go. There is a high correlation between ranking well and having the keyword you’re targeting as the very first words of the title tag.

Substantive Disagreement

  • H1 Tags (low correlation with rankings)
  • Alt Text (higher correlation with rankings)
  • Keyword in URL (very high correlation)
  • Keyword Location (data says closer to the front of the tag is much better)

They asked folks about things related to the page or domain (not links or keywords): things like the recency of page creation, historical content changes, use of advertising on the page, use of Google AdSense, meta description, etc. Existence of substantive, unique content on the page was viewed as most important, as was use of links on the page that point to other URLs on the same domain.

Google and Yahoo are more trust-focused, while Microsoft is more focused on the individual page.

[Rand is offering a lot of really great information and I'm not doing it justice trying to blog his charts and graphs (you try putting pie graphs into words!). With any luck, Rand will post his presentation over on SEOmoz later.]

The importance of SEO metrics in Google Search

  • Domains linking to URL
  • Google.com links
  • Yahoo! SE External Links
  • Google.com Toolbar PageRank

Subdomains vs root domains

Most people feel that content on subdomains inherits some, but not all, of the query-independent ranking metrics of the root domain and is judged partially as a separate entity. The data gathered by SEOmoz shows basically the same thing.

High-level view of Google’s algorithm

  • Trust authority of host domain – 25 percent
  • Link population of specific page – 22 percent
  • Anchor text of external links – 20 percent
  • On page keyword use – 15 percent
  • Traffic and CTR data – 7 percent
  • Social graph metrics – 6 percent
  • Hosting and registration – 5 percent

Future of Google’s Algorithm: They think links will continue to be a major part of Google’s algorithm. Forty-seven percent say links will decline in importance but still matter.

Next up is Marty Weintraub. I have a feeling this isn’t going to be pretty.

It used to be easy to figure out the ranking factors back in 1996. The good old days of spamming the bejesus out of search engines are over; it was like stealing back then. “State of the art” is still crude. Public understanding of search rankings ranges from strong to weak on semantics and power nodes, especially at the enterprise level. Hot shots still do it all by hand down to the mid-tail. But with cool APIs, it won’t be crude for long.

Fact: You can’t rank for [Las Vegas hotels] with a new Blogspot blog. And you can’t pick head terms by search frequency alone. But that’s how it was done when things were crude, and people are still trying to do it.

With the massive competitiveness of the Internet, you have to factor in attainability. The search engines have bullshit detectors. Pretty words don’t matter. Nor does attainability if nobody cares. Attainability doesn’t mean anything without frequency, and frequency is useless without attainability. Deep.
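A toy example of what Marty is getting at, with every number invented for illustration: score candidate keywords as frequency times attainability, and a huge head term that a new site can’t touch loses to a winnable long-tail phrase.

```python
# Hypothetical keyword scoring: frequency alone and attainability alone
# are both useless; only the product means anything. All numbers invented.

keywords = {
    # keyword: (monthly searches, attainability for a new site, 0.0-1.0)
    "las vegas hotels":                   (450_000, 0.001),
    "hotels":                             (2_000_000, 0.0002),
    "pet friendly hotels off the strip":  (1_200, 0.60),
    "vegas hotel with free parking":      (800, 0.70),
}

# Sort by frequency x attainability; zero out either factor and the score dies.
ranked = sorted(keywords.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)

for kw, (freq, attain) in ranked:
    print(f"{kw:38s} score = {freq * attain:8.1f}")
# The long-tail phrases come out on top, even though the head terms
# dwarf them in raw search frequency.
```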

Marty says that PageRank is pixel-green, already-obsolete fairy dust, and yet it bothers him when it rolls down. Hee. You would think Google would let us buy better.

Dawn of Transparent Next Gen Tools

The public seeks congruencies with SERPs. Fledgling metrics have emerged to remove our blinders. We need to deconstruct the real linking universe and not depend fully on the search engines. Enter Linkscape, SEOmoz’s brazen flagship. It’s the anti-black box for SEOs: it reveals metrics that the search engines won’t tell us. According to Rand, it’s catching up to Yahoo in girth. Marty says that Google must hate Linkscape. :)

The mainstream monopoly on comprehensive enterprise crawl and analyze is over. Black box is not the new black. Build a spreadsheet of metrics where you hypothesize and show correlation of pages to Google SERPs.

  1. Easy semantic evaluation, multiple sources.
  2. Linkscape, internal/external page & site scrapes, Yahoo! Site Explorer (quality of internal anchor text, page trust, page juice, topic relevance of inbound links)
  3. Majestic SEO, Wayback Machine, WHOIS, Alexa, Hitwise

Pulling spreadsheets is a far cry from being programmatic. Understanding page strengths and SERP competitiveness in charts takes HOURS. Define, wire up by API, etc.
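Here’s a rough sketch of that “define, wire up by API” workflow. The fetch_* functions are placeholders, not real APIs; in practice each one would call a source like the ones Marty lists above. The point is the shape: one row of metrics per ranking URL, dumped to a spreadsheet you can eyeball against the SERP.

```python
# Sketch of the "wire up by API" workflow. The fetch_* functions are
# hypothetical stand-ins for real data sources (Linkscape, Yahoo! Site
# Explorer, Majestic SEO, WHOIS, ...).
import csv

def fetch_linking_domains(url):
    return 0          # placeholder: count of unique domains linking to url

def fetch_top_anchor_text(url):
    return ""         # placeholder: most common inbound anchor text

def fetch_domain_age_years(url):
    return 0.0        # placeholder: age of the registered domain

COLUMNS = ["url", "serp_position", "linking_domains",
           "top_anchor_text", "domain_age_years"]

def build_row(url, serp_position):
    """Collect one spreadsheet row of metrics for a single ranking URL."""
    return {
        "url": url,
        "serp_position": serp_position,
        "linking_domains": fetch_linking_domains(url),
        "top_anchor_text": fetch_top_anchor_text(url),
        "domain_age_years": fetch_domain_age_years(url),
    }

def dump_spreadsheet(serp_urls, path="serp_metrics.csv"):
    """serp_urls: the top results for one query, best first."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        for position, url in enumerate(serp_urls, start=1):
            writer.writerow(build_row(url, position))

dump_spreadsheet(["http://example.com/a", "http://example.org/b"])
```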

Laura Lippay is up.

If search engine optimization were primarily about inlinks and browser titles, what’s stopping her mom from creating a motorcycle site with great affiliates that outranks Yahoo! Autos? Successful search engine optimization is meta tag optimization, affiliate networks, inlinks, keyword density, etc. But it’s ALSO about being viral, having buzz, getting links, and being your own linkbait.

What is that top ranking factor? A good product.

At Yahoo they use Rand’s SEO Ranking Factors, but there’s one small modification. The best ranking factor is having a great product.

To prove her point, Laura asks: if her mom had to choose one of these knitting sites, which one would she choose?

  • Knitting World: yarn articles, knitting news, knitting stores
  • Knitting Life: yarn articles, knitting news, knitting stores, PLUS you can upload videos and photos, there’s local information, and there’s a whole community behind the site.

If you were a search engine, you’d rank highest the same site her mother would choose, because that’s the better site. The best ranking factor you can have is a great product. Make the product manager and anyone involved in strategy your best friend. Before building SEO into a product, ask yourself:

  1. What will it take to be a category-killer?
  2. Exactly what do I need to build into my product to outrank my competitors?

Excellent advice.


About the Author

Lisa Barone

Lisa Barone co-founded Outspoken Media in 2009 and served as Chief Branding Officer until April 2012.


12 thoughts on “SEO Ranking Factors In 2009”

  1. It is extremely hard to believe that trust authority holds that high a regard among webmasters. Maybe I can’t grasp the concepts of where Google is headed, but I feel that they still strongly rely on the two that follow, which were link population and anchor text. It is crazy to see headings losing all their value, but of course it would be easy to manipulate your page to fly high in SERPs. Personally, though, I feel that Laura has the best idea of where Google is headed or already is. I feel that if there is a buzz in the online communities, Google is now advanced enough to detect the buzz and then reflect that in their SERPs. Anyway, good post on the theories of Google (WHY DON’T THEY JUST TELL US!!). Keep up the good work.

  2. First off, great post. You quickly relayed some excellent information in a clear format. Rand’s breakdown of the high-level view of G’s algorithm makes sense and paints a clear organic picture that my views align closely with. Thanks for sharing the wealth for those who were unable to attend SMX Advanced.

    On a side note, within your mini-bio I thought it was hilarious you put “actively neglecting lisabarone.com”, as I too have been neglecting my blog and was therefore too ashamed to place it in the website field above ^ :)

    Thanks again for the SMX lowdown.

  3. I can’t believe Laura mentioned meta tags and keyword density. Surely, she was joking? Or, perhaps, she meant that “meta tag optimization is actually writing compelling descriptions” and “the best keyword density is whatever reads well for your visitors”?

    P.S. You have a –> sticking out after “Related Posts”.

  4. I feel the same as Marty on the green fairy dust, yet when I have less of it, a little sadness takes place. Could use a fairy dust pick me up now and then. :)

  5. Hi Yura –

    I usually give this presentation to a less advanced crowd, but yes – only referencing “meta tags and keyword density” as basic SEO terms to show next to Search Algo-type terms like “Vector space models” and “Agglomerative clustering”. You could also [insert any of dozens of SEO terms here].

    :) L

  6. Again, I am completely surprised by the lack of actual insights taken from the 2009 local rankings. Where is the news here? Or is this supposed to be a primer for people who have just gone from graphic design to web marketing?

    Get links, write good text, make the site interesting. Key message!!!!

    All the other stuff, 25% here, 10% there, is pure speculation fueling the age-old adage of write content, be nice to the search engines, and you will get some kind of happy smile.

    It is more likely that Google is mining its database for ranking relevant websites in its listings and tracking prevalence and activity via one of the tracking tools that has an 81% chance of being installed on the website that Google sends the visitor to from its listings. I mean, we are talking business here, and in terms of self-regulation let’s just say that behavioural advertising is being very closely monitored by the FTC at the moment.

    TrustRank is such a lovely concept for brand managers; it suggests that SE algos go beyond a simple fetch and parse and that they are human. Which they ain’t!

    Keep up the good work // filed in my reader under comedy value :)

  7. I am a little confused about

    {Substantive Disagreement

    H1 Tags (low correlation with rankings)}

    So should we stop using H1 tags and use H2 or H3 instead, or stop using heading tags at all?
