Does anyone know the real title of this session? The PubCon Web site says “Proactive Link Campaign Tactics”, the PubCon booklet says “Link Building 2011” and elsewhere it’s listed as “Link Building 2012”. I have no idea. Either way, the speakers are the same and I’m pretty sure we’re gonna talk about link building tactics. That’s really all you need to know, right?
Up first is Glenn Cooke. [Okay, I may have originally typed that "cookie". Don't judge me.]
He doesn’t tell people what he does. He has an online business. He doesn’t do SEO for hire, he just does SEO for himself. He doesn’t have formulas, he doesn’t have tools, etc. He’s going to tell us what he does with some specific examples.
He asks if there are any Canadians in the crowd, eh? Industry Canada is a PR 7 free directory. As we will find out later in this session, Glenn loves directories.
The first link building tip he ever got was back in 2003. He walked outside of the hotel, spoke to a shaggy looking guy in a BOTW hoodie and asked him what PPC was. They got talking about .edu links and the guy told him to go find a university or a college club in your niche and offer to sponsor their next meeting – buy them pizza and beer – and get a sponsor link from them. Easy way to get a .edu link.
Link Building Theory
He’s looking for relevant and authoritative. Relevant sites are on topic and have backlinks from other relevant sites. Authoritative sites are sites that have links from other authority sites.
He breaks his link building ideas into two genres – Who and Why.
Who: He finds untapped directories in his niche. He’ll look at everyone who’s upstream of his product or service and develop a list. Everyone who buys his product gets broken down into groups – grandmothers, hunters, etc. If you decide that grandmothers are the people who buy your product, you can go looking for backlinks. Identify the different niches who buy your product.
Why: What’s the benefit to them to link to you? He does study and comparisons between Product A and Product B and then he goes looking for people who mention Product A or Product B and pitches the study. He has a certain success ratio for that. Calculators are a great thing. Build a calculator and put it on your Web site. Email people who are looking for that calculator. Or just give away the calculator ONLY to the people you want backlinks from.
He went and bought a whole slew of books that were out of copyright, scanned them, and put them on his Web site. Not only does that give you thousands of pages of high quality site content, now you have a historical research site. That will help you get links from .edus, government sites, and other sites that won’t give links to commercial Web sites.
The easy links:
- Directories – DMOZ, BOTW
- Directories – government, niche, and local
- Use Google advanced operators, e.g. plumbers site:.edu
- Excel spreadsheets – If you get a link from an Excel spreadsheet download, that counts, and he doesn’t think you can even nofollow a link inside a spreadsheet. To find them, use the file type filter in Google’s Advanced Search.
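The “easy links” queries above can be sketched as a couple of small query builders. The operators (site:, filetype:) are real Google search syntax; the helper names and the default extension are my own illustrative assumptions, not something from the session.

```python
# Hypothetical helpers that build the Google queries described above.
# The search operators are real; the function names are assumptions.

def edu_query(keyword):
    """Find .edu pages mentioning a keyword, e.g. 'plumbers site:.edu'."""
    return f"{keyword} site:.edu"

def spreadsheet_query(keyword, ext="xls"):
    """Find downloadable spreadsheets that might carry a followed link."""
    return f"{keyword} filetype:{ext}"

print(edu_query("plumbers"))          # plumbers site:.edu
print(spreadsheet_query("plumbers"))  # plumbers filetype:xls
```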
Getting Links from DMOZ:
Become the editor for the town where you live. Once you’re in, clean up your little city/town and you’ve got your link from DMOZ.
Call your competitors located in different states. Tell them you see they’re doing SEO and offer them tips you learned at PubCon. Tell them in exchange for telling them how to get a link from DMOZ, you want them to give you a link from their site. [OMG, seriously, people? This is what we've got in 2011?]
The Link Hook is his new idea. Ready? You leave bait out, but there’s a hook in it. What this means is: structure something so that site owners want something from you. Then you come up with a product or service, but it’s only available if they give you a link.
Next up is Russ Jones.
He’s going to say the absolute opposite of what Glenn just said. He’s going to talk about what agencies are doing to better scale link building. Shit’s about to get real, he says. We’re going to talk about math. Oh dear. NOT MATH!
The Maturation of Link Building
Metric Conscious Link Building
Actually measuring the mathematical value of potential links, the statistically-determined requirements of ranking, and the use of penalty mitigation.
The formula he shows uses raw mozRank, the number of links on the page, and the order of the link on the page to determine how much mozRank it can pass. There’s a crazy formula on the screen, but I can’t possibly get it down. It looks like hieroglyphics. Sorry.
We have to determine what is necessary to rank for any given term and any given Web site. We have to get smart about how it is we avoid penalties. To date, we’ve avoided penalties by being cowards and received penalties by being too bold. These kind of metrics are becoming the standard.
Natural Link Profiling: Looks at link depth (are a disproportionate number of your links coming from home pages), link colocation (how many other links are there pointing to your site from the same pages on average?) and link proximity (how many links are near yours?). He wrote about The Wikipedia Model for SEOmoz recently that touches on all of this.
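A minimal sketch of those three profile checks, assuming you’ve already exported your backlinks into a list of records. The field names and data shape are my invention for illustration; Russ didn’t show code.

```python
# Sketch of the three "natural link profile" checks: link depth,
# link colocation, and link proximity. Field names are assumptions.

def profile(backlinks):
    """backlinks: list of dicts with 'from_homepage' (bool),
    'links_on_page' (int), and 'links_near_ours' (int)."""
    n = len(backlinks)
    return {
        # link depth: what share of links come from home pages?
        "homepage_share": sum(b["from_homepage"] for b in backlinks) / n,
        # link colocation: how many other links sit on those pages on average?
        "avg_colocation": sum(b["links_on_page"] for b in backlinks) / n,
        # link proximity: how many links are near ours on average?
        "avg_proximity": sum(b["links_near_ours"] for b in backlinks) / n,
    }

sample = [
    {"from_homepage": True, "links_on_page": 40, "links_near_ours": 3},
    {"from_homepage": False, "links_on_page": 20, "links_near_ours": 1},
]
result = profile(sample)
```

A disproportionately high homepage_share is the kind of pattern the talk suggests an unnatural profile would show.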
There are tons of patterns like this that Google is using to find natural link profiles.
Computer Assisted Link Acquisition
The proliferation of tools, both public and private, to assist in the accumulation of relevant, juice-passing, non-footprint links.
1. Prospect advertising targets
2. Solicit contact information
3. Contact and, if paid, negotiate
4. Track, measure and repeat
Prospect advertising targets.
- Using private tools or public tools like Ontolo to find potential, relevant link sources
- Using metric sources like SEOmoz Site Intelligence and the Majestic SEO API to qualify link targets
- Using predictive analysis to determine sites that are likely to be receptive to link placement. What are the characteristics of pages that will give me a link? What are the characteristics of pages that never give me links?
- Crawling pages and sites to ID link networks or other negative factors. Is this prospect’s site getting all its link juice from an internal network? Are there non-quality links on this page?
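The prospecting bullets above amount to a qualification filter. Here’s a sketch of one, under assumed thresholds: the metric names echo the tools mentioned (mozRank, outbound link counts, network detection), but the cutoff values are illustrative, not from the talk.

```python
# Sketch of a prospect-qualification pass. The thresholds (mozRank >= 4.0,
# <= 100 outbound links) are assumed examples, not stated in the session.

def qualify(prospect, min_mozrank=4.0, max_outbound=100):
    """prospect: {'mozrank': float, 'outbound_links': int,
    'in_link_network': bool} -- a hypothetical record per candidate page."""
    if prospect["in_link_network"]:       # juice comes from an internal network
        return False
    if prospect["outbound_links"] > max_outbound:  # too many links on the page
        return False
    return prospect["mozrank"] >= min_mozrank

candidates = [
    {"mozrank": 5.2, "outbound_links": 30, "in_link_network": False},
    {"mozrank": 6.0, "outbound_links": 30, "in_link_network": True},
    {"mozrank": 2.1, "outbound_links": 10, "in_link_network": False},
]
shortlist = [c for c in candidates if qualify(c)]
```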
Finding Contact Information
- Using public labor pools to get access to contact info for Web sites
- Using home-brewed spiders
Contacting, Begging and Negotiating
- If paid links, using metrics to predetermine the potential value of the link and, thereby, IDing the offering price for negotiating link buys
- Building link value models on market prices for known networks like TLA
- Split testing everything: the name of the individual sending the link request email, the pitch used to get links, email deliverability measurements
Track, Measure and Repeat
- Using public tools like RavenTools to track links
- Or, better yet, use internal tools to track link placement uptime, changes in outbound links on the page, and PageRank and mozRank changes on the page
- Using predictive tools to determine whether an individual link acquisition will push a site over acceptable anchor text optimization thresholds. Combine the mozRank Passed formula and known anchor text balance to determine whether a link source will be overpowering.
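The anchor text threshold check above can be sketched as: would adding N new links with the same anchor push that anchor over a given share of the link profile? The 30% threshold here is an assumed example, not a known Google limit, and the anchor counts are made up.

```python
# Sketch of an anchor-text threshold check. The 0.30 threshold is an
# illustrative assumption; real tools would derive it from competitor data.

def anchor_share(anchor_counts, anchor):
    """Share of the link profile using a given anchor text."""
    total = sum(anchor_counts.values())
    return anchor_counts.get(anchor, 0) / total

def would_exceed(anchor_counts, anchor, new_links, threshold=0.30):
    """Would acquiring new_links with this anchor cross the threshold?"""
    counts = dict(anchor_counts)
    counts[anchor] = counts.get(anchor, 0) + new_links
    return anchor_share(counts, anchor) > threshold

profile_counts = {"brand name": 60, "cheap widgets": 25, "click here": 15}
risky = would_exceed(profile_counts, "cheap widgets", 20)   # 45/120 = 0.375
safe = would_exceed(profile_counts, "cheap widgets", 5)     # 30/105 ~ 0.286
```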
Shit’s ’bout to get real
- No more networks if you want long term stability: It’s an unnecessary risk.
- No more spaghetti against the wall: The data is there to help you make decisions.
- No more excuses: The experts are truly setting themselves apart from those of you who are still making them.