Saturday, December 14, 2024

Fighting Google’s Paid Link Stance With Robots.txt


Andy Beard has hit upon a compromise between removing text links from his site and being summarily punished by Google for selling them.

By banning Google’s crawler, Googlebot, from pages that might offend the search ad company, Beard said he is “lowering the red flag.”


“I have spent a long time deciding on a course of action, and have decided that blocking my content using Robots.txt is ultimately better for me, and better for people hiring my services,” he said.

“It also happens to be worse for Google than currently, but that is the beauty of this strategy.”

When he writes a paid review, Beard plans to add an entry to his robots.txt file telling Googlebot not to crawl it. If Googlebot never crawls the page, the paid links on it stay out of Google's link graph and cannot influence search rankings, which should leave the company nothing to object to.
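For instance, a minimal robots.txt entry blocking Googlebot from a single review page might look like the following (the path here is hypothetical; the article does not reproduce Beard's actual file):

    User-agent: Googlebot
    Disallow: /reviews/acme-widget-review/

Because the rule names Googlebot specifically, other search crawlers and human visitors can still reach the page, which is the point of the strategy.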

There is method to Beard's seeming madness: he has several other channels for getting his content in front of an audience, including social bookmarking, hub pages, authorized and unauthorized syndication, indexed search results and aggregators, and multimedia aimed at universal search.

“Nofollow is not the answer to Google’s troubles,” Beard said of Google’s preferred method of treating links. “As Google seem determined to impose the letter of the law rather than the spirit, I can’t see a reason why I shouldn’t sidestep the charging bull.”

If his experiment goes well for his site and the companies he reviews, we won’t be surprised to see more bloggers emulate his strategy.
