See, now this session is just going to hurt.  At least y’all will get some amusement as I burn my fingertips off trying to keep up. Ready? Come along.

Danny Sullivan is moderating a vet panel that includes Greg Boser, Bruce Clay, Vanessa Fox, Todd Friesen, Rae Hoffman, Stephan Spencer and Aaron Wall. Ignore that slight whimper you hear. It's just me crying.

We do intros:  Aaron’s a genius. Todd quit smoking three weeks ago. [high five!] Vanessa is pure. Greg is an Old Fart SEO. Stephan rocks.  Bruce plugs his 700 page book. Danny calls Rae a bitch.  That basically sums it up. Okay then.

The Canadian portion of our site is in a different folder (not a .ca domain) but has the same content as the American site.  Will Google view that as duplicate content?

Vanessa: Set the geographic target in Google Webmaster Tools. If you can, put the Canadian site on a .ca domain.

How can we handle duplicated content issues when the competition steals our content?

Stephan: Make sure you link to the original URL of that article in the byline or bio so that you’re sending a signal to Google that it’s the definitive version. This way when they steal the text, all the links will be pointing to you.

Vanessa: Make sure the links are absolute for that to work.
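To illustrate Vanessa's point about absolute links: a relative link in your byline resolves against whatever domain the text ends up on, so a scraper's copy would link to the scraper's own site. An absolute link keeps pointing home no matter where the article is republished. (The URL below is a made-up example.)

```html
<!-- Relative link: on a scraper's copy, this resolves to THEIR domain -->
<p class="byline">By Jane Doe — <a href="/articles/original-post">originally published here</a></p>

<!-- Absolute link: still points back to your site wherever the text is copied -->
<p class="byline">By Jane Doe — <a href="http://www.example.com/articles/original-post">originally published here</a></p>
```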

Greg: The site with the most juice wins that battle, so it can be problematic. But getting ripped off by someone with more authority isn't typically how it happens. Usually, you won't even see a negative impact.

Rae: If a scraper site is stealing your content and outranking you, your site sucks. You need to step it up a notch. Also file a DMCA.

Bruce: Federally copyright your content. Bruce did because people were stealing his site. You can have a lot of problems with ripoffs when you get into syndication. That's usually where the sites you syndicate to will outrank you.

Stephan: With registered copyright, you can sue for statutory damages, not just actual damages. You don’t have to prove every loss/gain.

Do things like bounce rate, time on site, page views matter for rankings? If not now, will they?

Vanessa: If you're talking about analytics data, no. She does think search engines look at how people interact with sites in the SERPs themselves. That would make sense. If your bounce rates are high, you need to look at how engaging the site is to begin with. That's the point of ranking well.

Todd: There are way too many outlier cases for all that stuff. Someone can open your site, go pick up their kids from school, and appear to stay on your site forever. It's a perfect conversion. There are other sites where you click through, get what you want, and you're off the site in seconds, but it was a good experience. He doesn't think they definitively roll bounce rate, et al, into the algorithm.

Stephan: It's more likely that Google is using Toolbar data than your analytics data. They're certainly seeing what's happening on the toolbar server.

Rae: She would just assume they use everything. They say they don't use the analytics data. Okay, maybe they don't. But with the number of people who have toolbars installed, the things they can tell about your Web site are huge. Assume they know and look at everything.

Vanessa: They're like Santa Claus! [When I grow up, I want to be Vanessa. I think I'm on my way.]

Greg:  Sometimes when you get from page 12 to page 2, that’s the audition period where Google is testing if you belong. If you concentrate on reducing bounce rate, the site will end up sticking there.  Google’s testing your site against what used to be there. The sites with the best user experience will stay. [Vanessa says it stays because you created a better site, not because they were testing you. Everyone pats her on the head.]

Aaron: If you have a server that’s not reliable and your site goes down a lot, you’ll lose rankings there, too.

Bruce: He thinks it's too easily spammed. You can get thousands of people from Europe to just click on your site. You need to worry about the quality of your content.

Danny: Google can look at Toolbar data and the clicks off the search results and decide how to move sites based on that. Yeah, it can be gamed, but there are ways to cross-check stuff.

My site recovered from a Google penalty. Is it safe to launch a subdomain or will I risk another penalty?  (The site was originally penalized for spammy links)

Everyone: Yes, it’s safe.

What do you think the value of subdomains is in terms of ranking power? He thinks they're lower. Agree?

Greg: Subdomains work great. He uses them all the time. You don't want to be overly granular with it, though. Breaking your top-level categories out as subdomains can help sections of your site compete better as individual sites. If you have a subdomain for bananas, it will compete better against other banana sites. Subdomains are also great for brand defense and ORM.

Todd: It’s more about why you’re using the subdomain. If you’re using it to stuff in another keyword, it’s probably not going to help you.

Rae: If you need proof subdomains still work, go to any major brand name. Like Google.

Vanessa: When you’re monitoring other people’s Web sites, you don’t know what else is happening with their site. They may have just bought a bunch of links or they did an infrastructure change. There are so many variables that you can’t tie one thing and say THAT’S it.

Rae: You have to create a real subdomain. You can’t just throw a paragraph up there.

Aaron: You can also register the .org and .net version of your site and throw content up there.

Let’s chat about Google Caffeine.

Vanessa: It's primarily an infrastructure change, not a ranking change. The new infrastructure allows new content to be indexed and rise to the top. If you see any ranking changes, that may be why.

Todd: He's seen a lot of the Universal stuff shifting around. From an indexing and search-return standpoint, that makes sense. He hasn't seen movement for clients.

Greg: They're tracking a large volume of stuff, and there are some pretty significant changes. Not in the top three, but in positions 4-10. It's hard to draw any sound conclusions because he doesn't know what other stuff is turned on, but they're seeing that listings which used to be double-indented no longer are. He's also seeing a stronger trend toward home pages. They have clients with a lot of internal pages ranking where Google takes the home page instead.

Bruce: They’ve found older pages ranking well in Caffeine. He thinks the regular Google index is being updated faster than the Caffeine index and they’re out of sync.

Vanessa:…if that’s the case, when they launch it, they’ll fill it up with the fresher stuff.

Rae: She’s seen sub pages dropping out and the home pages ranking in lower spots. She’s seen lots of weird stuff happening in Google over the past two weeks.

Todd: Keep in mind that the only people who know about Caffeine are search marketers.

Danny: Todd’s right. Search marketers are the people going to Caffeine so it is a good way to mine what they’re searching for.  It’s possible Google is just trying to steal some attention from Bing.  You can spend a lot of time trying to analyze, but if you’re really going to do that much research, you may want to go look at Bing and invest your time there.

What are the factors that influence Google Suggest? How do you get your site Suggested?

Stephan: For a long time the Google Suggest tool suggested that [digital cameras] was more popular than [digital camera] but the Google Keyword Tool said the opposite. He asked a Google rep which was true and he said to believe the AdWords Tool. Just FYI.

Aaron: We created terms using social media.  If your domain names match the keywords, that helps. You need fewer links to rank at the top, which will get you a disproportionate number of links. A lot of the Suggest stuff looks navigational.

Todd: Search volume, search citation, trends, etc.  He wants to clean up Google Suggest, not get into it.

Vanessa: Just take advantage of it. Type in whatever is relevant to your site, see what's suggested, and optimize for those terms as well. People are lazy; they'll use the suggested terms instead of typing things in themselves.

Aaron: Anything that saves people money, that’s easy to make trend, too.  If people don’t have a coupon, they’ll search for it a lot and that will become the Suggest term for the topic.

What do you recommend to make sure a large site is crawled well?

Vanessa: She likes XML sitemaps and thinks you may as well submit them because it can't hurt you. She does think it's not a replacement for good site architecture. Just because you submit a sitemap doesn't ensure they'll crawl those pages, but it does help them get a better picture of the site.
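For reference, a minimal XML sitemap follows the sitemaps.org protocol and looks roughly like this (the URLs and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-10-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/products/</loc>
  </url>
</urlset>
```

You can submit the file through Webmaster Tools or point crawlers at it from robots.txt with a line like `Sitemap: http://www.example.com/sitemap.xml`.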

Todd: Everything that you can control with Google, do it. Google might screw up.

Can I use a nofollow to prevent a page from being indexed?

Rae: No. You cannot use nofollow to disallow the engines from indexing something.

Aaron: The only way to keep it out of the index is to put a noindex on the page.
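For the record, the two directives operate at different levels: nofollow is a link-level attribute, while noindex is a page-level robots meta tag. A sketch of each (example URL made up):

```html
<!-- nofollow: tells engines not to pass equity through THIS link;
     the target page can still be indexed via other paths -->
<a href="http://www.example.com/some-page.html" rel="nofollow">some page</a>

<!-- noindex: placed in the <head> of the page you want kept OUT of the index -->
<meta name="robots" content="noindex">
```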

Can you speak about how search engines crawl JavaScript?

Vanessa: Last year Google announced they can crawl some parts of JavaScript and now they can execute some of the code and pull in outside resources. Her guess would be they’ll recrawl pages and be able to execute the JavaScript and see that new links are being pulled in but she wouldn’t rely on it.  Vanessa wrote an article on The Searchability of JavaScript you may want to check out.

Are breadcrumbs important?

Everyone: YES.

You changed your URLs and properly redirected them. Should you immediately pull the old links?

Stephan: Wait until the engines hit the redirects.
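A 301 (permanent) redirect is the standard way to move a URL; on Apache it can be set up in .htaccess roughly like this (paths and domain are examples):

```apache
# .htaccess — permanently redirect an old URL to its new home
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Or, with mod_rewrite, redirect a whole directory:
RewriteEngine On
RewriteRule ^old-dir/(.*)$ http://www.example.com/new-dir/$1 [R=301,L]
```

Per Stephan's advice, leave the old inbound links in place until the engines have actually crawled these redirects.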

Top Flash tools to ensure that content and navigation links are being indexed? When should I use Flash?

Aaron: From Google’s perspective, Flash has a negative against it. If people see a flashing car and music, it makes Google look bad because the site is dumb.

Rae: Why would you put your navigation in Flash? There's no reason to even take the risk.

Todd: You come across sites that have paid a million dollars for some amazing, super awesome site and it’s all in Flash. They’ve come to you saying we need SEO help. You cannot sit in that room and tell them you’re going to have to scrap it. Cloak it.

How do you determine if a site is authoritative?

Rae: Does it rank well for main keywords?

Stephan: Does it have sitelinks for non-brand keywords?

Vanessa: If you're asking from a link standpoint, ask whether the site gets a lot of visitors who would also want to come to your site.

How do you report spam?

Greg: It’s a waste of time. Instead, learn from it. See why it’s working and find a way to spin it.  Just because you’re at 5 and the person at 4 is spamming, doesn’t mean you’ll replace them.

Rae: If you report someone, Google’s going to come in and look at EVERYONE.  Look at what they’re doing and figure out why it’s working. Work on your own damn site.

What are the dumbest SEO mistakes you've seen?

Rae: Someone put up an RSS scraper site on display at a site review.

Stephan: Pinkberry had the same title on every page, all Flash, no content.

Todd: Bed Bath & Beyond – every title and meta tag is the same. They're all "bed bath beyond product". Product should have been a find/replace. They have 2,400 lines of JavaScript above the page content.

Vanessa:  Someone’s host had blocked their site via a robots.txt so it was never indexed. For a YEAR.

Greg: There's a lot of geotargeting in the gambling industry. Sites didn't realize Google crawls from US IPs, so they block Google from crawling and never rank. He also had a client accidentally use the Google Site Removal Tool.


About the Author

Lisa Barone

Lisa Barone co-founded Outspoken Media in 2009 and served as Chief Branding Officer until April 2012.


3 thoughts on "Ask the SEOs"


  • Suthnautr said:

    Regarding the use of Flash navigation, I was thinking that with Google’s relaxing a bit on not passing along page rank using “nofollow” tags, that Flash might be an option. Aaron Wall, and even more so Rae Hoffman :) indicate that Flash navigation is simply not a good idea – and Todd Friesen just plain says (if there has to be a lot of Flash) cloak it.

    I’d like to see more discussion in the area of how to allow indexing, but still control (block) the flow of link equity even if Google changes the “nofollow” rules.


  • Suthnautr said:

    On that previous question though, I could designate a noindex folder on the site and link to redirects inside the folder that would point to files on a subdomain for pages I don’t want to pass any equity to, which would still allow those pages be indexed, whether naturally over time or artificially through outside links that will eventually get crawled.

