Google recently made far-reaching changes to the way it ranks search results, and the search marketing community has been abuzz with tales of woe ever since. Some have speculated that the key to understanding Google’s latest move is that it has applied some sort of test of “commerciality” to certain phrases, roiling the waters for sites ranked well on those phrases while leaving non-commercial phrases more or less alone. The idea is that this would cause a stampede of site owners over to the paid AdWords program, or at least make the point that Google isn’t up for providing a free lunch to clever “SEO-ized” sites indefinitely.
So is this just a theory? Or has Google composed a hit list of terms that are commercially valuable, and changed the way it ranks sites on those terms?
And if so, how does it determine which terms to go after? Some really obscure ones (yet with high bid prices in the pay-per-click program) have seemingly been affected, just as more popular terms have.
Consider this: they have plenty of data from the AdWords program to help them decide which terms are “valuable.” All the data they need, in fact, about the number of advertisers competing over certain terms, and how much they’re willing to pay to be shown on them.
But isn’t this just wild speculation? How can we know for sure?
One way is to try a few queries on the new Scroogle tool, which lists how many sites on a given query have dropped out of the top 100 since the last index update (I won’t go into how this comparison is possible… in fact it may not be possible for more than a few days, as Google is very likely to change the playing field again).
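For the curious, the comparison itself is trivial to reproduce if you save a query’s top 100 before and after an update. Here is a minimal sketch in Python, with made-up placeholder URLs (my own illustration, not Scroogle’s actual code):

```python
# Sketch of a Scroogle-style comparison (not Scroogle's actual code):
# given two saved top-100 lists for the same query, count how many URLs
# from the old list no longer appear anywhere in the new one.

def dropped_from_top100(old_top100, new_top100):
    """Return URLs present in the old top 100 but absent from the new one."""
    new_set = set(new_top100)
    return [url for url in old_top100 if url not in new_set]

# Placeholder data; the real lists would be scraped and saved before and
# after the index update.
old = ["http://example-nursery.com/maples", "http://example-trees.com/"]
new = ["http://example-trees.com/", "http://example-garden.com/"]

dropped = dropped_from_top100(old, new)
print(f"{len(dropped)} of {len(old)} links dropped out of the top 100")
```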
On a query for “japanese maples,” 32 of the former top 100 links have dropped out of the top 100, including poor #23:
http://www.regannursery.com/also_in_stock/japanese_maples_list.htm
Meanwhile, looking for the Latin name, acer palmatum, gives a very different result. NONE of the links, according to Scroogle, have changed. Google has done nothing to the existing rankings on this term.
So you might ask yourself: how the heck do they do this? How do they decide what’s commercial, and what isn’t?
That’s easy. They can look at how many advertisers are advertising on those terms, and how much they’re bidding. (Unfortunately most of those Japanese Maple sellers weren’t smart enough to bid on “japanese maple,” let alone acer palmatum, something they would have done had they listened to our advice!)
There’s a precedent for this – it might actually have been something that inspired Google, who knows. Infospace, publisher of the Metacrawler and Dogpile metasearch engines, came out with a method that first determines the degree of commerciality of a query, then decides how many paid links to show in the mix on Metacrawler as against ordinary (unpaid) index results. I’m not sure exactly how that filter worked, but it was likely nothing sophisticated: largely tied to the number of advertisers in the space and the average bid on those words.
In the case of Japanese maple, there are at least five bidders on the term, whereas acer palmatum only has a couple of bidders, one of whom is selling Acer computers. If it’s only a nickel term anyway, Google figures it’s noncommercial (at least for all practical purposes), so doesn’t interfere.
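To make the speculation concrete, here’s a minimal sketch of the kind of “commerciality test” I’m imagining. Everything in it is assumption: the thresholds, the inputs, and the premise that AdWords data feeds any such test at all. The numbers simply mirror the anecdotes in this article.

```python
# Purely speculative sketch of a "commerciality" test; none of this is
# Google's actual logic. Thresholds mirror the anecdotes above: roughly
# five bidders on "japanese maples" (re-ranked) versus a couple of nickel
# bids on "acer palmatum" (left alone).

def looks_commercial(num_advertisers, top_bid_dollars):
    """Guess whether a query is commercially contested.

    Both inputs are hypothetical: stand-ins for the kind of per-query
    data an AdWords-style program naturally accumulates.
    """
    if num_advertisers >= 5:
        return True   # crowded auction: clearly contested
    if num_advertisers >= 3 and top_bid_dollars >= 0.25:
        return True   # small field, but real money on the table
    return False      # a "nickel term": treat as noncommercial

print(looks_commercial(5, 1.25))   # "japanese maples"-like -> True
print(looks_commercial(2, 0.05))   # "acer palmatum"-like  -> False
print(looks_commercial(3, 0.20))   # "mango chutney"-like  -> False
```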
For the term “arboretum,” only two links of 100 have dropped out. There are no advertisers.
For the phrase “pick your own,” zero links have changed, according to Scroogle. There are also zero advertisers.
No listings have changed for “mango chutney.” There are three advertisers. Not enough to create bidding wars, likely top bid less than a quarter.
The term “medical supply,” by contrast, has at least twelve advertisers, including some large multinational corporations. Listings on this phrase have been shaken up significantly, with 89 of 100 listings dropping out of the top 100. Although not a huge cash cow on the ad side, it’s about $1.25 to get into second spot… just around $0.75 for third position. A bid of 25 cents will get you into 5th or 6th place.
Now — who’s the new top dog on this term? Moore Medical. Not a household name, and for heaven’s sake don’t look at their PageRank, because it’s a 2. But they do seem to be one thing: relatively deep-pocketed.
“As a $130 million, publicly-held company with more than 50 years of experience, we have both the infrastructure and distribution network to serve our customers’ needs efficiently and cost-effectively.”
The former #1, MedSupplyCo.com, certainly seems like a decent outfit. Good site, plenty of customers, PageRank of 6. However, not as impressive in other areas:
“The Medical Supply Company, Inc. was founded in 1998. Our goal is to provide quality medical supplies at discounted prices. We work hard to keep our customers happy, and it shows. If you have any comments or suggestions for us, please feel free to drop us a line.”
The new #2 site, Allegro Medical, isn’t publicly traded like Moore, and indeed doesn’t stand out in particular. One thing you do notice, along with their PageRank of 5, is that they’re a Google advertiser.
So maybe Google’s process is essentially a two-stage one:
1. Target “commercial” queries for a “roiling of the waters” – re-rank all sites falling under certain “commercial” queries, depending on the perceived value of that term as measured by its value within the AdWords program;
2. Attempt to make a global judgment as to which of the following “types” a site or page falls into: resource/discussion/information, store/affiliate, or company.
The presumption here is that “companies” deserve to keep their listings and have every right to explain what they do, enjoy strong inbound links, and have a “presence.” But the closer something gets to being a “store,” the more sensible it is to make the purveyor pay for a listing. Otherwise, the free listings just become an endless playground for SEOs to squeeze free money out of Google by gaming their way to top rankings.
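If you wanted to render this two-stage conjecture as code, it might look something like the sketch below. To be clear, this is my speculation dressed up in Python, not anything Google has confirmed; the page “types” and the signals used to guess them are invented for illustration.

```python
# Pure conjecture dressed up as code: a sketch of the two-stage process
# described above, not anything Google has confirmed. The page "types"
# and the signals used to guess them are my assumptions.

COMMERCIAL_TYPES = {"store", "affiliate"}

def classify_page(page):
    """Crudely bucket a page as store, resource, company, or affiliate.

    `page` is a hypothetical dict of signals: shopping-cart markup, how
    long the company has been established, whether the page offers
    non-sales content (articles, discussion, resources).
    """
    if page.get("has_shopping_cart") and not page.get("informational_content"):
        return "store"
    if page.get("informational_content"):
        return "resource"
    if page.get("years_established", 0) >= 5:
        return "company"
    return "affiliate"

def rerank(results, query_is_commercial):
    """Stage 1: leave non-commercial queries alone entirely.
    Stage 2: within commercial queries, push 'store'-like pages down."""
    if not query_is_commercial:
        return results  # e.g. "acer palmatum": nothing changes
    keep = [p for p in results if classify_page(p) not in COMMERCIAL_TYPES]
    demote = [p for p in results if classify_page(p) in COMMERCIAL_TYPES]
    return keep + demote

# Toy usage with invented signal values:
results = [
    {"has_shopping_cart": True},        # store
    {"informational_content": True},    # resource
    {"years_established": 50},          # company (Moore-like)
]
print([classify_page(p) for p in rerank(results, query_is_commercial=True)])
# -> ['resource', 'company', 'store']
```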
It has to be something like that. It’s certainly no mere “speculation” that Google is up to something along these lines. There is enough evidence that points to certain patterns over and over again. The shortest path to the truth seems to be the following rule: “where there is a critical mass of advertisers, Google has chosen to re-rank the index.”
Admittedly, this is an oversimplification at best. Certainly there are other characteristics of the re-index that look more like past spam blitzes: sites which have aggressively pursued link swaps or keyword-rich domains, for example, have apparently been big losers in the latest blitz. Armchair sleuths are busy at work trying to unravel what it all means. I suspect there is more continuity than we would like to admit. Google is doing what search engines have been doing for years: studying common SEO techniques and trying to ensure that clever marketers don’t get the upper hand in the “free” index. What’s different is how much is at stake if Google can indeed dislodge the best-laid plans of free-riders at this crucial stage. Next time, maybe they’ll think twice about shunning the advertising program at this time of year.
I think we should, at least, put to rest the idea that there is any overt or covert “ranking reward” for being an advertiser. The re-ranking is based on principles that may have been espoused within Google and all search engines for years. We’ve seen trends towards giving more top-ten listings to sites that involve discussion, comparison, content, resources, etc. It’s just been accelerated now, and made more aggressive.
Another revenue angle to consider: Google now actually stands to make money from quality content sites which get ranked well. That’s because many of these sites now show AdSense ads, and Google gets a little revenue share from them, too.
It’s ingenious, really. Google has figured out how to get paid much more than the zilch they used to get paid for running a search engine, whether users click through to commercial listings or quality content, and yet without assaulting users with irrelevant, commercialized search index listings. Quite the opposite, actually. Arguably, relevancy is just as much in evidence as it always was in the main index. The listings for “acer palmatum” and “ichthyology” haven’t budged.
So does this mean SEO is dead? Far from it. Responsible search marketers have always been able to generate a mix of stable top 20 rankings on popular terms (by doing PR the hard way and generating real, not manufactured, online word-of-mouth), along with a healthy mix of top five rankings on less popular terms, by highlighting rich, varied, related content that intersects with the highly specialized search queries users type in all the time. The techniques for doing that won’t be “caught,” because there is nothing wrong with having rich, unique, varied content. (Or products that hardly anyone else sells!) Advertisers who have always struck the right balance between paid and unpaid listings, and between interesting content, company information, and commercial pages, won’t have been caught flat-footed by this latest diabolical Google reshuffle.
But plenty of cocky search-marketing cowboys — those who felt a growing sense of entitlement to rankings they “earned” for their sites on the strength of little more than a few cheap parlor tricks — have been knocked to the dirt hard in this latest dustup. Most will pick themselves up, knock back a couple of shots of the hard stuff, and prepare themselves to fight again. But some won’t. Some of Google’s most notorious freeloaders may finally gather their things, put their crumpled black hats on, and ride unsteadily off into the sunset, never to be heard from again.
That’s the hope, anyway.
“We’re Always Trying to Improve the Index for the User”: Google
The problem with rampant speculation is that it’s usually at least as damaging as the behind-the-scenes shenanigans you’re trying to speculate on. Recent armchair attempts to explain the master plan behind Google’s recent index update, mine included, are no exception.
So, lest I be accused of being the Oliver Stone of search, after talking with Peter Norvig, Google’s Director of Quality, I’d like to clarify a few things.
I’ve always known and believed that there was no relationship between Google’s advertising program and its index results, absolutely none. After working so closely with so many advertisers, it would have been pretty obvious if we’d been getting some sort of positive “spillover” effect. Many of us weren’t suggesting this sort of relationship, but it *wasn’t* out of line to make the point that a significant reshuffle at this time of year does make many non-advertisers aware of the fact that they might have become too dependent on free listings. Google doesn’t have to foster or maintain a relationship between the right and left-hand sides of the search results page to benefit from the fact that both sides are in constant flux.
[Disclaimer, if that’s even the right word: my company benefits, too, since we help people figure out how to make their dollars go farther on the right-hand side of said page.]
Google’s not unaware of this. A closing comment from spokesperson Nate Tyler seemed to contain something of a pointed message in this regard: “People need to be aware that the Google index was not designed to be a predictable way for companies to get traffic, although, of course, if you type Amazon, you’re pretty sure to see Amazon.com up at the top, since that’s clearly the most relevant result.” Unsurprisingly, Google would rather have you as an advertiser than not, and if the threat of unstable, unpredictable index rankings for private-sector actors is enough to convince more of them to finally invest dollars in AdWords, then so be it.
One point I did exaggerate a bit in my article, though not without some good reason, is the fact that Google can certainly collect information about the financial value, to Google, of certain search queries as those queries are monetized through the ad program. I might have mis-guessed as to how such data might be used — and I certainly wouldn’t want to suggest any kind of systematic relationship between ads and index — but it’s certainly the case that Google is at least *in possession of* information telling them which queries are commercial, and which aren’t.
But that’s neither here nor there.
According to Norvig, Google is “always making changes to its index, and it measures the quality of results before and after.” One explanation for the current hue and cry, in Norvig’s view, is simply that “Google went for a period of several months with no major changes, and some webmasters got complacent about their search rankings to the point where they felt deserving of them.”
One point worth making is that changes to the index don’t always affect all queries equally. When Google rolls out product improvements like matching on “stems” and “plurals,” some queries are affected and others aren’t.
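A toy illustration of why that kind of rollout is uneven (my own example, not Google’s implementation): a stems-and-plurals change can only touch queries whose terms actually have variant forms.

```python
# Toy illustration of why a stems/plurals rollout touches some queries
# and not others; my own example, not Google's implementation.

def naive_stem(word):
    """A deliberately crude stemmer: strip a trailing 's'."""
    return word[:-1] if word.endswith("s") and len(word) > 3 else word

def affected_by_stemming(query_terms):
    """A query is affected only if stemming changes at least one term."""
    return any(naive_stem(t) != t for t in query_terms)

print(affected_by_stemming(["japanese", "maples"]))  # True: "maples" -> "maple"
print(affected_by_stemming(["acer", "palmatum"]))    # False: nothing to stem
```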
The most recent enhancement, says Norvig, can be boiled down to “attempts to give the correct value to a page.” This is what caused problems for so many sites that had managed to climb high in the results — higher than their sites warranted — by exploiting search optimization tactics. In short, this was in large part your run-of-the-mill anti-spam re-ranking, but Google may also have begun down the path of incorporating new cues to a site’s quality or relevance, to make the results that much more useful to the public.
“We used to look at just links and keywords, but now we’re incorporating a lot of other stuff… looking for more and more signals and types of information on a page that attempts to determine or read a ‘real meaning’ or what a page is trying to provide,” continued Norvig.
He acknowledged that some of my speculation, the part where I suggested that Google was making more effort to discern the “type” of information on a page (resource/discussion/information, store/affiliate, company, etc.) “was heading down the right path.” Norvig even went so far as to agree that the type of thing Google “might” do would be to look for information such as “how long a company has been established, what kind of information is it showing to the site visitor, etc.” It’s safe to say in such a context that that traditional bastion of SEO, the hastily-assembled “microsite,” would have trouble cracking a top ten listing under this type of formula. But wasn’t PageRank supposed to be immune to that junk anyway? Is Google quietly admitting that it has to layer more and more tests of quality into its algorithm because it’s powerless to stop the growth of link farms and superfluous reciprocal linking?
And although I’m satisfied with Google’s ongoing efforts to achieve higher quality, at this point, it looks as if the quality of listings is more predictable on non-commercial queries.
Because, after listening carefully to every possible factor that Google might take into account in judging quality and relevancy, when I type “fruit basket” into the search box, I’m still confused.
http://www.google.com/search?sourceid=navclient&q=fruit+basket
This is the top-ranked site on that query:
http://selectsmart.com/FREE/select.php?client=Haru
The #2 site is no work of art, either:
http://www.geocities.com/ai_no_hyde/Fruits_Basket.html
The #4 result for “fruit basket companies” is a bit of spam that should be caught easily:
http://www.newpurple2.com/wireless.htm
One thing is clear. This won’t be the last Google Dance. The next one can’t come too soon for many webmasters.
Andrew Goodman is Principal of Page Zero Media, a marketing consultancy which focuses on maximizing clients’ paid search marketing campaigns.
In 1999 Andrew co-founded Traffick.com, an acclaimed “guide to portals” which foresaw the rise of trends such as paid search and semantic analysis.