Monday, October 7, 2024

Multiple Search Engine Universe

For the past three years, Google has dominated the search engine field. Just 12 months ago, Google provided nearly 80% of all search results viewed on the web, including the organic results displayed at Yahoo.

Today, Google’s market share of organic search results is shrinking rapidly while the size and influence of Yahoo’s and MSN’s organic search offerings grow. Add to the mix a number of up-and-coming search firms such as Ask Jeeves and the new Chinese-government-owned Accoona, and a remarkable picture emerges: the end of the mono-culture search universe. This has great implications for SEOs and their clients, as the number of essential search tools on which to get listings has increased. For clients, it will be easier to be found if there are more places to find a site, and for SEOs, it will become easier to make client sites findable. A multi-engine search universe will also protect site owners from difficult periods like the one experienced during the infamous Florida Update of last Christmas season.

Today, Google has approximately 46% of the organic search market, with Yahoo (26%) and MSN (21%) following close behind. (Yahoo owns Inktomi, a database of spidered sites that currently provides results to MSN.com. These results will be replaced by a proprietary database when the new MSN Search beta is released.)

Back in 1999, there were six major search engines: AltaVista, Lycos, Infoseek, AOL, Yahoo and the new upstart, Google. Each search tool had unique characteristics, and each depended on what were then fairly rigid keyword densities. This led to the creation of “doorway” pages, series of pages designed to rank well on different search engines under a set of keyword phrases. A unique doorway page would be created for each search engine and, in some cases, for each keyword phrase targeted. This technique led to the obvious clogging of search engines with useless pages, “page pollution”. While the similarities between the Big 3 search tools should limit the urge to build doorway pages, webmasters and SEOs are cautioned to watch that this technique does not become common again.
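For readers unfamiliar with the metric that drove all this, here is a minimal Python sketch of a keyword-density calculation. No engine ever published its exact formula, so this simple phrase-words-to-total-words ratio is an assumption made purely for illustration:

    # Minimal sketch of a keyword-density calculation, the metric the
    # early engines were believed to weigh heavily. Each engine's real
    # formula was never published; this simple ratio is an assumption.
    import re

    def keyword_density(page_text, phrase):
        words = re.findall(r"[a-z0-9']+", page_text.lower())
        phrase_words = phrase.lower().split()
        if not words or not phrase_words:
            return 0.0
        hits = sum(
            1 for i in range(len(words) - len(phrase_words) + 1)
            if words[i:i + len(phrase_words)] == phrase_words
        )
        # Density = words belonging to the phrase / total words on the page.
        return (hits * len(phrase_words)) / len(words)

    print(keyword_density("search engine placement by a search engine firm",
                          "search engine"))  # 0.5

Chasing a different “ideal” density for each engine is exactly what produced the per-engine doorway pages described above.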

Google Dives Deeper

Common-sense SEO once held that Google would only travel to the second layer of a database, that is, as far as the second variable in a URL’s query string. Sometimes it would delve deeper into a site, but often it would not. That was the conventional wisdom until very recently. According to SEO Roundtable moderator “projectphp”, GoogleBot is now capable of diving down to the sixth variable in database-driven URLs.
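A quick way to audit a catalogue for this is simply to count the query-string variables in each URL. The Python sketch below does exactly that; the example URL is hypothetical, and the six-variable threshold is taken from the report above rather than from any published Google documentation:

    # Counts query-string variables in a URL to flag pages that were,
    # until recently, considered too deep for GoogleBot to reach.
    from urllib.parse import urlparse, parse_qsl

    def variable_depth(url):
        return len(parse_qsl(urlparse(url).query))

    url = "http://example.com/store.php?cat=2&sub=14&item=901"  # hypothetical
    depth = variable_depth(url)
    print(depth, "crawlable" if depth <= 6 else "likely too deep")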

This has fairly large implications for SEOs worried about getting specific product listings for their clients. In the past, we were always forced to tell clients that we could not push Google deep into a database and would instead do our best to get visitors as close as possible. Now it is clearly possible for a dynamically generated page to achieve strong listings, provided the requisite SEO work has been performed. A note of caution, though: many databases are extremely deep and contain literally thousands of different products, and SEO work on every product page may prove cost-prohibitive. There is, however, some hope that effective site-mapping and very tight SEO work on basic templates will provide a strong work-around to cost considerations, along the lines sketched below.
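One way to read “site-mapping” here is a single static HTML page that links to every dynamically generated product URL, giving a spider one shallow entry point into the whole catalogue. The Python sketch below generates such a page; the product list and URL pattern are hypothetical:

    # Sketch of the site-mapping work-around: emit one static HTML page
    # linking to every dynamic product URL, so a spider can reach deep
    # pages from a single shallow entry point. Data here is hypothetical.
    products = [
        (901, "Red Widget"),
        (902, "Blue Widget"),
        (903, "Green Widget"),
    ]

    links = "\n".join(
        '<li><a href="/store.php?item=%d">%s</a></li>' % (pid, name)
        for pid, name in products
    )

    with open("sitemap.html", "w") as f:
        f.write("<html><head><title>Site Map</title></head><body>\n"
                "<ul>\n%s\n</ul>\n</body></html>\n" % links)

Because the page is generated from the same database that feeds the store, it stays current without hand-editing, and the tight template SEO mentioned above then does the per-page work.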

Jim Hedger is the SEO Manager of StepForth Search Engine Placement Inc. Based in Victoria, BC, Canada, StepForth is the result of the consolidation of BraveArt Website Management, Promotion Experts, and Phoenix Creative Works, and has provided professional search engine placement and management services since 1997. http://www.stepforth.com/ Tel – 250-385-1190 Toll Free – 877-385-5526 Fax – 250-385-1198
