OMG, Guys! It’s the last session of SMX East 2010! Can you even handle it? It seems like just yesterday we were just kicking things off. Now we’re all so much smarter, so much more in-the-know and so, so sleep deprived. This session, however, is going to be a big one. Up on stage we have a bunch of SEOs who need no introduction. However, we’ll introduce them anyway. Give a virtual hello to Greg Boser, Bruce Clay, Vanessa Fox, Todd Friesen, Rae Hoffman, Stephan Spencer, Jill Whalen. I expect the stage to collapse from ego any second. ;) Just kidding.
Okay, this one is straight question and answer so let’s do it.
Is creating a tag cloud of site-related HTML keywords still effective in establishing PageRank?
Do two links to the same page pass twice as much internal link juice as a single link?
Vanessa: If you’re asking that kind of a question, you don’t have enough things to do.
It’s nice to have a lot of links but having the most links isn’t as good as having quality links. Agree?
Rae: She’d be more concerned about the anchor text.
Stephan: He wants to know the context of it – keywords, authority, location on the page, etc.
How would you deal with a site where the content is behind a firewall because of regulations?
Stephan: Just let Googlebot in
Todd: But Google doesn’t want to serve up results that users can’t get to. If you absolutely have to, you’re kind of screwed. You’re going to have to create content outside of that.
Vanessa: To get even beyond the cloaking issue, it still would be a legal problem because if they showed it to Google it would show up in the cache.
Vanessa: The spam team at Google might not be okay with that…
[The panelists find out it’s not an age-specific site. The guy runs a financial site and is going after the long tail searches]
Greg: If it’s a situation where everything’s coming through your home page, that’s going to be treated and viewed differently than if you have thousands of pages you’re doing the 1-click free thing on. Intent is a big issue.
Rae: Create category pages and then write abstracts from the hidden pages. Then, link build to the category pages like crazy.
What would be the advantages for having your own URL shorteners? Would you recommend creating your own?
Stephan: If you have a brand, you want to push out your branded URL. Just make sure it’s 301’d.
Rae: She recommends custom URL shorteners. That way you can ensure it’s a 301 redirect. It’s under your control so you can update the link. Also, there was a URL shortener in the SEO industry people used to use…and then it went out of business. So now those links don’t render to anything. It also prevents tweetjacking.
Todd: If some service starts getting really abused, you don’t know when Google will decide bit.ly stops passing juice. If it’s your own, you know that you own it and it’s yours.
Greg: For branding, the shortener is a great idea. If you think moving forward you’ll get any significant value from that kind of 301, you won’t. Not all 301s are created the same. It may make sense to see if you had any historical record of that URL. Google may come to treat that 301 like a bounce.
Bruce: Remember that people periodically do type in those URLs. If you look at the shortener code, some of that can be typed in incorrectly. At least if they do that on yours, they’ll get your 404.
Jill: While you’re shortening your URLs, make sure to add Google Analytics tracking codes in there.
Stephan: If you can aim for shorter URLs for blog posts/products, do. Marketing Sherpa found that short URLs get clicked on twice as often as long URLs do in the search results.
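To make the panel's advice concrete, here's a minimal sketch of what a self-hosted shortener could look like, using only Python's standard library. The domain, short codes, and utm parameters are hypothetical placeholders, not anything the panelists described.

```python
# Minimal self-hosted URL shortener sketch that issues 301 redirects.
# The short-code table, destination URL, and utm tags are made-up examples.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Short code -> destination. Because you control this table, a link can be
# updated later (Rae's point) instead of dying with a defunct third party.
LINKS = {
    "/smx10": "https://example.com/blog/smx-east-2010"
              "?utm_source=twitter&utm_medium=social",  # Jill: tag for analytics
}

def resolve(path):
    """Return (status, location) for a requested short path."""
    if path in LINKS:
        return 301, LINKS[path]  # permanent redirect, as Stephan recommends
    return 404, None             # Bruce: a mistyped code at least hits *your* 404

class Shortener(BaseHTTPRequestHandler):
    def do_GET(self):
        status, location = resolve(self.path)
        self.send_response(status)
        if location:
            self.send_header("Location", location)
        self.end_headers()

# To serve it: HTTPServer(("", 8000), Shortener).serve_forever()
```

The key design choice is the one the panel keeps repeating: the redirect must be a 301 (permanent), not a 302, and the mapping table stays under your control.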
Can people linking to you hurt you?
Bruce: Bruce says no and shakes his head yes. The basic thing about participating in a link farm is it will hurt you if you do it at a high volume. But having a competitor buy links and having them link to you is not supposed to be able to hurt you. He believes that.
Greg: There are ways. In the old days, the way it was combated was that the page selling the links was throttled. If you’re buying links for a term you already rank for, those links just didn’t count. That is a little different as big name networks get hit. Instead of demoting the whole site, you’d disappear for the actual keyword phrase used in these networks. You’d still rank for the pages/keywords you didn’t buy links for.
Stephan: He knows folks who have bought crappy links and torched their competitors.
Vanessa: If a competitor buys links to your site, then Google can tell those are paid links and they won’t count them for value. It depends how clean your site is. If the site is dirty and competitors point dirty links at you, it may tip the scales.
Todd: It’s link profile. If you buy 50,000 links to a site that has 3 million links, you have to go really, really big at that point. It’s not going to have much effect in the larger profile.
Bruce: Over time, the more low quality links you have, the lower you’ll rank.
Do you think most people need to worry about this?
Greg: It’s rare, but when it happens, you’ll know it.
Jill: That’s why you need to have a decent link profile.
Rae: That’s why you never piss off an SEO. [Or maybe, why you just never piss off Rae.]
How are you optimizing for TrustRank?
Stephan: The difference between PageRank and TrustRank is that with TrustRank you calculate the value of links starting from a trusted seed set, not a random seed set.
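Stephan's distinction can be shown with a toy power iteration: the same propagation, but the teleport vector is concentrated on a hand-picked trusted seed instead of spread uniformly. The graph and scores below are invented for illustration and have nothing to do with Google's actual implementation.

```python
# Toy illustration of PageRank vs. TrustRank: identical iteration,
# different teleport (restart) vector. The link graph is made up.
def propagate(graph, teleport, damping=0.85, iters=50):
    score = {p: teleport.get(p, 0.0) for p in graph}
    for _ in range(iters):
        # Each round: restart mass goes to the teleport set,
        # the rest flows along outlinks.
        nxt = {p: (1 - damping) * teleport.get(p, 0.0) for p in graph}
        for page, outlinks in graph.items():
            for target in outlinks:
                nxt[target] += damping * score[page] / len(outlinks)
        score = nxt
    return score

graph = {
    "seed": ["a"], "a": ["b"], "b": ["a"],      # reachable from the trusted seed
    "spam": ["spam2"], "spam2": ["spam"],       # an isolated link-exchange loop
}
n = len(graph)
pagerank  = propagate(graph, {p: 1 / n for p in graph})  # uniform teleport
trustrank = propagate(graph, {"seed": 1.0})              # trusted seed only
```

Under uniform teleport the spam loop still accumulates score, but with a trusted seed it gets none, which is the point of seeding from trusted pages.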
Vanessa: There are 200+ ranking signals. So when you hear about one, it doesn’t mean it’s more important than any of the others.
Greg: You need to separate trust and authority because they get interchanged. He trusts his 10-year-old child but that doesn’t mean he’s authoritative. You can be reasonably far away from the seed group as far as trust, but you can still be authoritative on a subject.
Jill: Trust and authority are the kinds of things you can try to optimize for by actually having trust and authority.
Greg: It’s hard to trick people to trust you.
What do you think social is doing in terms of rankings?
Rae: If you have a link that’s being passed around on Twitter, it will get indexed faster. It’s quality signals – it’s not Twitter. The more traffic and the more signals that you send to Google that your site is useful, the better you’re going to do. She also thinks social media is good at getting indirect links and attention from people who wouldn’t have seen you otherwise. You’re not sharing your link on Twitter to share it on Twitter, you’re trying to get indirect links from it.
Todd: The links are nofollowed all over the place. But if you stop and think about the signals within all these social networks, you want your link passed around by people who are real, who have friends, who have influence, etc. There are a lot of quality signals just within the social networks that you can look at to determine if people are real or not. You can’t just create a bunch of fake profiles to click on your links.
Greg: He always thinks about what he would do if it were his search engine. He thinks Twitter is an incredible pre-signal. If there’s a barrage of links flying through Twitter and then they start discovering links to that page from traditional blogs, that’s the signal connection.
Vanessa: We need to think beyond links. Google started out focused on links because that was the only signal available. Now there are a lot of other signals. So who knows what kinds of signals Google will use in the future, but it only makes sense that they’re using all the other signals the way they used links.
Bruce: You can manufacture a lot of sites and tweets and links, but the concept of trust comes from looking outward – who links to you, who do they link to, and are the sites a second jump away from you trustworthy?
What’s your latest epiphany? Something you thought would take off but didn’t. Or something that surprised you.
Greg: His biggest thing has been the changes in redirection. A 301 is supposed to be for permanently moved content, but we’ve come into a situation where it gets used a lot of ways. Any redirection that is canonical-based in nature, they now try to handle with the canonical tag. If you want credit for it, you have to do it within the confines that Google established.
Jill: Flat site architecture and internal anchor text used to really be great and easy, but they don’t seem to have as much power as they used to.
Todd: He’s been working on crawl efficiency – blocking categories in the robots.txt file. They’re managing the crawl by blocking pages that have no value. They’re getting a better crawl rate and ranking. It’s been working spectacularly for driving long tail.
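Todd doesn't share his actual rules, but a crawl-efficiency robots.txt of the kind he describes might look like the sketch below. The paths are hypothetical examples of low-value sections, not anything from the session.

```
# Hypothetical robots.txt sketch of the crawl-efficiency tactic Todd describes:
# block sections with no search value so crawl budget goes to pages that rank.
User-agent: *
Disallow: /tag/        # thin tag archives (example path)
Disallow: /search/     # internal search results (example path)
Disallow: /print/      # duplicate printer-friendly pages (example path)

Sitemap: https://www.example.com/sitemap.xml
```

The tradeoff to weigh: a disallowed page can't pass signals from its own crawled content, so this only helps when the blocked pages genuinely have no value.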
Rae: She wishes she hadn’t spent so much time optimizing for Yahoo. Domain trust has also typically been really important, and she was surprised to see it seemingly devalued a bit with the Google Mayday update, which made internal pages of a trusted domain a bit harder to rank.
Stephan: He likes tracking freeloaders – pages that don’t bring any traffic from Google but that are indexed. If you have a reporting tool, you can track which pages are crawled but not bringing in any visitors. You can find them and make some tweaks.
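Stephan's "freeloader" report boils down to a set difference between pages that get crawled/indexed and pages that ever land a Google visitor. A minimal sketch, with placeholder URL lists standing in for whatever your log files and analytics exports actually contain:

```python
# Sketch of a "freeloader pages" report: crawled/indexed pages that never
# receive an organic visit from Google. The URL sets are made-up examples.
crawled = {"/about", "/blog/post-1", "/blog/post-2", "/old-promo"}

# Landing pages with at least one Google organic visit (from analytics).
landing_pages_from_google = {"/blog/post-1", "/about"}

# Pages Google crawls but never sends traffic to: candidates for a content
# refresh, better internal linking, or removal.
freeloaders = sorted(crawled - landing_pages_from_google)
```

In practice the `crawled` set would come from server logs filtered to Googlebot hits, and the landing-page set from an analytics export, but the core of the report is just this difference.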
Bruce: They’ve added a lot of emphasis on behavioral analysis, attempting to figure out sequences of queries to try and construct some of the content for long tail terms that would be appropriate for a community. They create personas for the person who would be creating the query in the first place. They’re also putting more emphasis on Local.
Greg: New links on bad content = bad idea.
Vanessa: It’s harder than it seems to figure out the difference between optimizing for the algorithms and optimizing for things that matter. It’s hard to understand what the fundamentals are and separate it from over-optimization.
Jill: What Vanessa says is so true. Before you worry about the extra stuff, make sure the basics and fundamentals are done.
Danny: He’s really interested in Twitter Search. People search by asking questions – “anyone know about ___”. If you’re Dominos, don’t just track people talking about Dominos. Find them when they’re talking about pizza and give them a coupon code.
And we are done. Thanks so much for sticking with us. Hope you enjoyed the liveblog coverage.