Now that the long-awaited PR update has come and gone, there are reports of Google’s indexing spider continuing to perform comprehensive site scans. Murdok recently ran an article discussing these types of scans, and most of the speculation pointed toward the upcoming PR update. Now that this quarter’s update is complete, why does the Googlebot continue to bombard sites?
Have You Been Assimilated Into Google?
Theories abound concerning the Googlebots, with many concentrating on Google testing a new crawler and perhaps building a new index. A thread at WebmasterWorld devoted to the Googlebot scans revealed some interesting possibilities. One poster named Sri_gan offered these observations, “[perhaps] Google and MSN are testing their new crawlers. Possibly Google wanted to test and identify cloaking pages through the Mozilla User Agent which it could track maximum redirections as well crawl other application pages.”
Another poster, g1smd, wrote, “Looks like Google has built and published a new index based on the massive spidering that they did a week or so ago.” Poster Webdude agreed, saying, “This is what I am seeing too. In my areas, it looks like the index has been rebuilt from the ground up. I still think this has to do with a cloaking and metarefresh/302 problem the G was having. New IPs? New datacenters? New Index? Don’t know yet for sure yet.”
On the SearchEngineWatch Forums, a discussion recently popped up about Google possibly doing a major update, perhaps something reminiscent of the Florida update, even though Google has indicated that users won’t be seeing updates of that size anymore. At SEW, poster DaveAtIFG said, “Since 9/25, I’m see fresh tags updated and SERPs adjusted with EACH search. I last saw this same behavior between 11/27/03 and 12/12/03. Since then, I’ve become accustomed to seeing fresh tags updated every few days. And we saw a (toolbar PageRank) update last week. I’m convinced a major update is under way.”
I recently interviewed Garry Grant, CEO of Search Engine Optimization, Inc., and one of the things we discussed was the Googlebot proliferation. During the discussion, he indicated that Google has created another spider, which would help explain this high volume of scanning. In an upcoming article discussing Google’s new spider, Garry stated:
“Only the inner circle at Googleplex knows the purpose of this new spider, but there is no shortage of theories. Some of the more popular ones include:
• Google is creating an entirely new way for ranking pages – something that could dramatically change a web site’s position on the search engine
• Google is trying to weed out fraudulent links
• Google is conducting an experiment in order to improve search engine results”
Garry’s statements are echoed by WebProWorld poster Profilesite, who theorizes, “they could spider once normally, and then use this second robot to compare the content and see if they were being tricked.”
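To make that dual-crawl idea concrete, here is a minimal sketch, not anything Google has published or confirmed, of how a fetch under a crawler-style User-Agent could be compared against a second fetch under a browser-style User-Agent to flag possible cloaking. The URL and User-Agent strings below are purely illustrative, and a real system would have to tolerate legitimate differences such as rotating ads or timestamps rather than comparing raw hashes.

```python
import hashlib
import urllib.request

# Illustrative User-Agent strings; Google's actual crawler behavior and
# comparison logic are not public.
CRAWLER_UA = "Googlebot/2.1 (+http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US)"


def fetch_digest(url, user_agent):
    """Fetch a page with the given User-Agent and return an MD5 digest of the body."""
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request, timeout=10) as response:
        return hashlib.md5(response.read()).hexdigest()


def looks_cloaked(url):
    """Crude cloaking signal: does the page served to a crawler-style UA
    differ from the page served to a browser-style UA?"""
    return fetch_digest(url, CRAWLER_UA) != fetch_digest(url, BROWSER_UA)


if __name__ == "__main__":
    # Hypothetical example URL.
    print(looks_cloaked("http://www.example.com/"))
```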
All of these are plausible, logically sound theories. Given Google’s close-to-the-vest approach, we probably won’t know what’s really happening with the Googlebots until after the fact. If the second bot is scanning for search engine and PR spam techniques, in all likelihood no one will find out until sites start getting banned from the index. Of course, it could be that Google is simply building another index for its new clustering technology. Only time will tell.
Chris Richardson is a search engine writer and editor for Murdok. Visit Murdok for the latest search news.