It seems it was only a matter of time before the cleverer element of the SEO world developed a workaround for Google’s penalizing of paid links. The workaround involves a pretty creative “dynamic” linking strategy, and it’s playing a little bit dirty.
Loopholes are no longer the sole province of tax accountants, lawyers, and politicians: Andy Beard has developed an elaborate one, proposing how to get around Google’s paid-link vigilance via robots.txt and paid reviews.*
Beard’s explanation is complicated, lengthy, and loaded with historical context, so visit his blog for the full treatment, complete with nifty diagrams. What we provide here is an overview and basic introduction, not necessarily an endorsement.
Beard’s proposal (or, as he describes it, a red flag in the face of the charging bull) involves strategic use of robots.txt to keep Google’s crawlers away from paid reviews. This is intended to take the sting out of the penalty: Google can’t penalize a page it isn’t supposed to crawl in the first place.
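For the non-technical: robots.txt is just a plain-text file at the site root telling compliant crawlers what to skip. A minimal sketch of the kind of rule involved (the /paid-reviews/ path is our hypothetical example, not a path Beard prescribes):

```
# robots.txt at the site root; the /paid-reviews/ path is hypothetical
User-agent: Googlebot
Disallow: /paid-reviews/
```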
In addition to the paid review that is blocked from crawlers, the author creates a follow-up review at another domain; that review is not paid, and it links back to the original review, link juice in tow. According to Beard, a client would pay for the link on the original domain, but not for the follow-up on the separate domain (though I imagine the price just got higher, huh?).
The link in the paid review is not a nofollow link, meaning it will still pass PageRank, since Google shouldn’t know or care about it if the page can’t be crawled. The link in the follow-up review isn’t a nofollow either, because it is, technically, not a paid link.
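To make the mechanics concrete, here is roughly what the two links might look like in markup (the domains are our placeholders, not Beard’s):

```html
<!-- On the blocked paid review: a normal, followed link (no rel="nofollow"),
     on the theory that Google never crawls this page anyway -->
<a href="http://client.example.com/">the client's product</a>

<!-- On the unpaid follow-up review at a second domain: also a followed
     link, pointing back at the blocked original review -->
<a href="http://reviews.example.com/paid-reviews/widget-review/">my original review</a>
```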
In theory, the original, blocked review will still pass a reduced amount of PageRank, because Google still credits “dangling” pages, or pages it can’t see, with PageRank if there are backlinks pointing to them. The link juice such a page passes is reduced, however, as is the link juice coming from backlinks to it. What happens next is a matter of determination and scale.
With enough backlinks, especially authority backlinks, the decrease in link juice can be overcome (as I understand it), eventually raising the blocked page’s PageRank, which is then passed on to the intended recipient of the paid link.
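If you want a feel for the arithmetic, here is a toy PageRank power iteration. It’s our sketch, not Beard’s and certainly not Google’s actual algorithm, and it ignores whatever extra discount Google applies to uncrawled pages; it only illustrates the directional claim that more backlinks into the blocked review mean more rank flowing through to the target.

```python
# Toy PageRank power iteration. The page names, graphs, and damping
# factor are illustrative assumptions; this is not Google's algorithm.

def pagerank(links, damping=0.85, iters=50):
    """links: {page: [pages it links to]}. Returns {page: rank}."""
    pages = set(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            if outs:  # split this page's rank across its outbound links
                share = damping * rank[page] / len(outs)
                for out in outs:
                    new[out] += share
        rank = new
    return rank

# "blocked" is the robots.txt-excluded paid review; "target" is the client.
few  = {"a": ["blocked"], "b": [], "c": [], "d": [],
        "blocked": ["target"], "target": []}
many = {"a": ["blocked"], "b": ["blocked"], "c": ["blocked"],
        "d": ["blocked"], "blocked": ["target"], "target": []}

print(pagerank(few)["target"])   # ~0.064
print(pagerank(many)["target"])  # ~0.118 -- more backlinks, more juice
```

Run it and the second figure comes out higher, which is the whole gambit in miniature.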
Phew! So, it’s kind of like link-laundering.
Your first objection is probably that Google’s pretty vigilant about link-spam, too, and bursts of low-quality links over a short period of time will raise the spam alarms, thus either earning penalties anyway or negating the collective power of those links.
Quite right, which is why Andy has a plan for that, too. This is where it gets a bit harder, since it involves a real commitment to getting that paid link some good juice to pass along. But that work should probably be part of your overall web-marketing campaign anyway.
Beard proposes getting authority links via:
Social bookmarking: A short description, a title, and a link from BloggingZoom, Digg, or another social site is all that is needed to carry a decent amount of relevant link juice to the target.
Targeted RSS syndication: Syndicate the article and make sure it links back. Send it to “hub pages” on content sites that accept syndicated articles via RSS (because Google won’t be looking in RSS feeds, either). Aggregators like Technorati, which will index a snippet and a link, also make use of RSS feeds. (A sketch of one such feed item follows this list.)
Authorized and unauthorized article syndication: Beard syndicates his articles to other publications with high PageRank. Link back to an un-crawled page from there and you’ve given it some much-needed power. What he calls “unauthorized syndication” we usually call “scraping.” On the bright side, publishers can make the most of scrapers by not making a fuss, and instead requiring a link.
Targeting Universal Search: Use images, video/audio descriptions, etc., in unpaid content (which is also syndicated, I assume, to sites intended for that kind of media) to point back to paid content.
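Returning to the feed item promised above: here is roughly what a syndicated RSS entry carrying the link back might look like (structure simplified, URLs ours):

```xml
<!-- One <item> from the syndicated feed (channel wrapper omitted).
     The item link itself points back to the original, blocked review. -->
<item>
  <title>Review: Example Widget</title>
  <link>http://reviews.example.com/paid-reviews/widget-review/</link>
  <description>A quick look at the Example Widget; the full review sits at the link above.</description>
</item>
```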
If Google doesn’t find a way to penalize it, this could be a viable (if involved) strategy. But it is also more akin to traditional web marketing—taking advantage of the channels you have in order to promote.** It’s doubtful that less legitimate paid linkers will take the time and effort to promote this way, but you have to admire Beard’s never-say-die attitude.
*This all hinges, of course, on whether it will work and for how long, and on how much you rely on Google as a search-traffic generator. The hard truth is that Google is the de facto search engine on the Net, so making el Goog happy, whether or not you agree with el Goog’s decrees, is an important part of the game. And nobody likes an unhappy el Goog.
**Google’s penalties seem also to be forcing webmasters to do (nearly) legitimate content and marketing work, which is an interesting side-development.