Saturday, November 2, 2024

Link Structure Beats Google’s Network Filter

If you run a network of sites, then there’s a newly speculated algorithmic filter you need to be aware of – the network filter.

First reported in WebMasterWorld and then in SEORoundtable, the network filter reportedly blocks the passage of PageRank and link popularity.

Some believe this filter looks for “unnatural” linking structures, that is, linking structures that wouldn’t occur in the wilds of the web as people discover and link to content. Barry says the filter may be triggered if “you have 20 Web sites and all 20 link to each other on every page.”
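To make the speculation concrete, here’s a rough Python sketch of the sort of pattern Barry describes: a handful of sites where every site links to every other site in both directions. The domain names and the is_fully_interlinked helper are purely illustrative assumptions; nothing here reflects how Google actually implements any filter.

from itertools import combinations

# Hypothetical network where all three sites cross-link in both directions.
links = {
    ("site-a.com", "site-b.com"), ("site-b.com", "site-a.com"),
    ("site-a.com", "site-c.com"), ("site-c.com", "site-a.com"),
    ("site-b.com", "site-c.com"), ("site-c.com", "site-b.com"),
}

def is_fully_interlinked(sites, links):
    # True when every pair of sites links to each other in both directions.
    return all((a, b) in links and (b, a) in links for a, b in combinations(sites, 2))

print(is_fully_interlinked(["site-a.com", "site-b.com", "site-c.com"], links))  # True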

Linking Structures Getting Caught in the Filters…

Barry later wrote a follow-up to the network filter discovery, called Pyramid Linking Strategies, in which he discusses a method for linking networks of sites that may avoid this filter.

Barry had a conversation with an SEO whose sole practice is link building. This SEO, who wished to remain anonymous so that Google couldn’t target him directly, creates networks of portals that link to related, quality sites as well as to his clients’ sites. He links them in a triangular fashion.

“For example, one might create Site A, B and C. Site A will link to Site B, Site B will link to Site C and site C will link to site A. You will never find Site A link to Site B AND Site B link to Site A.”
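Here is a minimal Python sketch of that triangular arrangement, assuming the structure is exactly as the quote describes: each site links only to the next one in a ring, so no two sites ever link back to each other. The domain names are hypothetical.

sites = ["site-a.com", "site-b.com", "site-c.com"]

# Build the ring: A -> B, B -> C, C -> A.
links = {(sites[i], sites[(i + 1) % len(sites)]) for i in range(len(sites))}

# Verify there is no reciprocal pair anywhere in the structure.
assert not any((b, a) in links for (a, b) in links)
print(sorted(links))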

The portal development method that Barry’s friend employs is worth mentioning here. “The portal sites will be on specific topics and contain links to high-level ODP and Yahoo! directory sites, you won’t normally find deep-level sites listed in the portal sites,” says Barry.

He also links these portal sites to his clients.

Getting on-topic links to his portal sites is much easier than getting links directly to his clients.

Once the portals are up, he begins his “complex pyramid linking structure”: he links his portals to his clients and even his portals to each other, but never his clients back to the portals.
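As a rough sketch, and assuming the pyramid works the way Barry relays it, the structure might look like this in Python: portals link to the client sites and to each other in one direction only, and nothing ever points from a client back to a portal. The domain names are made up for illustration.

portals = ["widgets-portal.com", "widgets-news.com", "widgets-guide.com"]
clients = ["client-store.com"]

links = set()
for i, portal in enumerate(portals):
    # Portals link to each other one way only, in the same ring shape as above.
    links.add((portal, portals[(i + 1) % len(portals)]))
    # Every portal links to every client site.
    for client in clients:
        links.add((portal, client))

# Clients never link back to the portals, and no reciprocal pairs exist.
assert not any(src in clients for src, _ in links)
assert not any((dst, src) in links for src, dst in links)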

Barry also reported recently that large sites with thousands of pages are suffering from a reduced presence in Google’s index. The network filter as Barry described it didn’t seem to remove pages from the index though – it simply removed the ability to pass PageRank and link popularity.

Barry wondered whether Google’s duplicate content filter is responsible for the dropped pages from large sites, but could the affected sites’ linking structures be related somehow?

Here’s the large sites discussion in WebMasterWorld.

Garrett French is the editor of murdok’s eBusiness channel. You can talk to him directly at WebProWorld, the eBusiness Community Forum.
