Saturday, December 14, 2024

Finding the Strongest Links

Given that many large brands and mainstream media sites are trying to leverage their brand strength by adding interactive content to their sites, and every SEO blog in the world (and some from distant universes) has posts about leveraging social media and building trust with link baiting, it is probably a safe bet that Google is going to be forced away from trusting core domain-related trust…and it is going to have to get even better at filtering link quality as well.

You know Digg spamming is mainstream when there are many different business models based on spamming Digg.

Other social media sites are not behind the curve in getting spammed to bits. I recently noticed spam software for mass submission of videos to video hosting sites, I see del.icio.us and Technorati pages ranking everywhere, and when I look at del.icio.us I run into spammed-out tag pages.

Wow. Garbage.

When you look at Google’s search results for long tail queries in consumer finance or other profitable verticals, you see many ranking sites that are various flavors of forums, user accounts, XSS exploits, and other social spam. In spite of Yahoo! being the most visited website, compare Google’s recent stock performance to Yahoo!’s. Given that content as a business model does not scale well, traditional monopoly-based content providers are going to have to work hard to get users to create, add value to, and organize their content. As they do, many of these types of sites will make it easier and easier to leverage them directly (as easy-to-rank content hosts) and indirectly (via indirect traffic and direct link authority) to spam Google.

The brief history of SEO (well, as best I know it) goes something like this:
matching file names
page titles and meta tags
keyword density
full page analysis
link count
PageRank
anchor text
manual and algorithmic link filtering
duplicate content detection and semantic analysis
delaying rankings
delaying indexing
and now we are up to site-related trust…which is getting spammed to bits and will only get worse

Anything that has been greatly trusted has been abused. The difference between the current abuse and past abuse is that in the past it was typically smaller individuals screwing with Google. Now Google has become a large enough force that they are actually undermining many of the business models of the providers of the content they are relying on.

Going forward, especially as Google, small content providers, unlimited choice, and easier access to the web marginalize the business models of many of the sites Google currently trusts, those sites are going to rely on users to help foot the bill. Google will give some content providers a backdoor deal, but most will have to look to user interaction to add value. That user interaction will be spamville. Thus, rather than just trusting sites at the core domain level, I think Google is going to have to reduce its weighting on domain trust and place more on how well the individual page is integrated into the site and into the web as a whole.
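
To make that shift concrete, here is a toy sketch of what blending domain-level trust with page-level integration might look like. Everything in it (the signals, the weights, the 0-to-1 scales) is my own invention for illustration, not anything Google has published:

    # Toy illustration only: hypothetical signals and weights, not Google's algorithm.
    def page_trust_score(domain_trust, external_inlinks, internal_inlinks,
                         domain_weight=0.3, page_weight=0.7):
        """Blend site-wide trust with how well one page is integrated into the web.

        domain_trust: assumed 0..1 trust assigned to the site as a whole
        external_inlinks: inbound links to this page from other sites
        internal_inlinks: inbound links to this page from the same site
        """
        total = external_inlinks + internal_inlinks
        # A page propped up only by internal links earns little integration credit.
        integration = external_inlinks / total if total else 0.0
        return domain_weight * domain_trust + page_weight * integration

    # An orphan page on a trusted domain vs. a well-cited page on a modest site:
    print(page_trust_score(0.9, external_inlinks=0, internal_inlinks=200))   # ~0.27
    print(page_trust_score(0.4, external_inlinks=50, internal_inlinks=50))   # ~0.47

Under those made-up weights, the well-cited page on the weaker domain wins, which is exactly the direction the argument above points.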

If everything Google trusts gets abused (it eventually does), and they are currently trusting raw domain-related trust too much (they are), it shouldn’t be surprising if their next move is to get even more selective about what they are willing to index or rank, and which links they will place weight on.

Jim Boykin recently announced the launch of his Strongest Subpages Tool. Why the need for it?

If you’re getting a link from a page that no other site links to (beyond that site), what is the true trust of that page?

However, if you get a link from a subpage that has lots of links to it, and your link is on that page, there’s outside trust flowing to that page.

If you’re getting links from pages that only have internal links pointing at them, I doubt there’s much value in them.
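
To make Jim’s point concrete, here is a minimal sketch (my own toy code, not Jim’s actual tool) that ranks a site’s subpages by how many external pages link to them, ignoring internal links entirely. The URLs in the example are made up:

    # Minimal sketch of the 'strongest subpages' idea; not Jim's actual tool.
    from collections import Counter
    from urllib.parse import urlparse

    def strongest_subpages(inbound_links):
        """inbound_links: iterable of (source_url, target_url) pairs.

        A link counts as external when the source host differs from the
        target host; only external links add to a subpage's strength here.
        """
        external_counts = Counter()
        for source, target in inbound_links:
            if urlparse(source).netloc != urlparse(target).netloc:
                external_counts[target] += 1
        return external_counts.most_common()

    links = [
        ("http://example.org/a", "http://seobook.com/tools"),
        ("http://example.net/b", "http://seobook.com/tools"),
        ("http://seobook.com/", "http://seobook.com/archives"),  # internal, ignored
        ("http://example.org/c", "http://seobook.com/archives"),
    ]
    print(strongest_subpages(links))
    # [('http://seobook.com/tools', 2), ('http://seobook.com/archives', 1)]

By this scoring, a link from the /tools page is worth more than one from /archives, because more outside trust flows into the page your link sits on.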

Jim’s tool has been pretty popular, so if you have trouble accessing it, don’t forget that you can do something similar with SEO for Firefox. Just search Yahoo! for site:seobook.com -adsfasdtfgs, where adsfasdtfgs is some random gibberish text. That will show you how well a page is integrated into the web on many levels: page level .edu links, external inbound links to that page, and so on. You can also go to the Yahoo! Search settings page and return 100 results per search.
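
If you want to script that trick, something like the following builds the query for any domain. The Yahoo! URL pattern and parameter names here are my assumptions from memory, so verify them before relying on them:

    # Sketch of the site: query trick; the Yahoo! URL parameters are assumptions.
    import random
    import string
    from urllib.parse import urlencode

    def site_query_url(domain, results_per_page=100):
        # Excluding a gibberish term that no page contains leaves the result
        # set unchanged, but makes the engine treat this as a normal web query.
        gibberish = "".join(random.choices(string.ascii_lowercase, k=10))
        query = f"site:{domain} -{gibberish}"
        params = urlencode({"p": query, "n": results_per_page})
        return "http://search.yahoo.com/search?" + params

    print(site_query_url("seobook.com"))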

Aaron Wall is the author of SEO Book, an ebook offering the latest search engine optimization tips and strategies. From SEOBook.com Aaron gives away free advice and search engine optimization tools. He is a regular conference speaker, partner in Clientside SEM, and runs the Threadwatch community.
