Over the past few weeks, there has been a great deal of speculation about a coming shift in Google’s ranking algorithms.
Several recent pieces of evidence point to a coming shift; however, it is very difficult to predict what, if anything, will happen. It is pretty much a given that there will be some form of update to the way Google reads and ranks sites. Google actually makes minor adjustments on a regular basis. There are rare occasions, however, when major changes are introduced. I suspect this is one of those times.
Most webmasters will remember last year’s Florida Update, which turned Google’s rankings upside down for about eight weeks. That eight-week period caused a great deal of turmoil for SEOs, small businesses and webmasters. The Florida Update was introduced on November 15, just six weeks before Christmas and at the start of the most important season for retailers. If Google does update its algorithm in the next few weeks, another sudden round of “placement dislocation” may occur, frustrating retailers desperate for online Christmas sales. While it is impossible to predict such an update with 100% accuracy, there are a number of simple steps webmasters and SEOs can take to protect their clients in the case of a major update.
Let GoogleBot do its thing
While Google’s ability to contextualize information found on a specific page is extremely complex, the primary way it gathers information from each page it visits is extremely simple. Google uses a hyperactive spider known as GoogleBot. Like all well-evolved search spiders, GoogleBot lives to follow links. Every web page online that does not block spiders will be found by Google, provided there is a link directed to it. If a new site has a link coming to it from a site in Google’s vast index, GoogleBot will find it. If you add a new page to your preexisting website and provide a link to that new page, Google will find it. It is really that simple. In the past few years, I have not submitted a client’s site to Google. I have simply made sure there were a number of links directed to it from other websites and let GoogleBot do the rest, as it is programmed to do.
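To illustrate, a single plain text link from any page already in Google’s index is enough for GoogleBot to discover a new page. The URL and anchor text below are purely hypothetical:

<a href="http://www.example-widgets.com/new-page.html">Our New Widget Page</a>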
K.I.S.S. – On-site Tips
I am a big proponent of keeping it simple, Sam. Like nature, search engine spiders love simplicity in all its forms. That said, we need to recognize that simplicity is easier said than done. While modern website design calls for increasingly complex technologies and commercial site design often relies on the use of multiple databases, there are ways to give spiders the information they require in the simplest possible formats.
Information Organization
Keeping a site simple is a matter of information organization. As much as possible, try to address only one topic per page. Your starting point is the HOME page of the site. If you have only one product or message, the job is relatively easy. If you have several products or a wide variety of information to express, organize that information as succinctly as possible. Channel the various information streams into sub-directories. The key is to be as focused as possible when presenting information to search engines, distributing different blocks of information across as many pages as necessary. The days when a single page could represent several products are long over.
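As a rough sketch of what this might look like, a widget retailer with several product lines could lay out its files along these lines (the directory and file names are hypothetical):

/index.html – home page, overall company message
/blue-widgets/index.html – blue widget product line
/blue-widgets/specifications.html – blue widget specifications
/red-widgets/index.html – red widget product line
/red-widgets/specifications.html – red widget specifications
/ordering/index.html – ordering and shipping information

Each sub-directory gathers the pages for one topic, and each page within it addresses a single aspect of that topic.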
Optimize your content page by page
One of the reasons SEO can be expensive is that a good SEO will work on every spiderable page in a website. This can be painstaking work but if done properly, it always pays off in the end. The scope of the job can be daunting though, especially when you consider the vast size of many websites today. A smart way to look at presenting information to search engine spiders without getting overwhelmed is to break the site down to basic components or elements. The most important on-site elements are Titles, Tags, Text and (internal) Links.
Titles
Aside from the URL, the title is the first on-site element seen by a spider. Titles play an important role in keyword contextualization for Google. Placing a keyword phrase in the title of a page is extremely important to achieving rankings for that phrase. Webmasters and site design planners need to remember that each page in a site presents specific information and thus requires a unique title. If you are following the single-focus-per-page tip, titles are usually pretty easy to formulate. Take the two strongest keyword phrases that apply to that specific page and place them beside a keyword phrase that describes the overall site. For example,
Blue Widgets :: Preformed Building Widgets :: Construction Supplies
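In the page’s HTML, that example would simply sit in the head of the document as the title tag:

<title>Blue Widgets :: Preformed Building Widgets :: Construction Supplies</title>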
Tags
Meta tags do matter; however, there are only two essential meta tags that should be included on every page in the site: the description and keyword tags. Of these two, the more important is the description. The description tag provides GoogleBot with primary topical information about the content of that specific page. A good description is often two sentences long and uses target keywords and phrases. The keyword tag is of lesser importance but does play a small role at both Google and Yahoo. Don’t spend an inordinate amount of time on this tag, but be sure your target keywords and phrases are mentioned at least once and not more than three times.
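As a sketch of how the two tags might look on a blue widgets page, with wording that is purely illustrative:

<meta name="description" content="Blue Widgets supplies preformed building widgets and construction widgets to contractors. Order blue widgets online or by phone.">
<meta name="keywords" content="blue widgets, construction widgets, preformed building widgets">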
Body Text
The text is the most important element on the page. The most important reminder regarding text is to focus on one subject or topic per page. Make sure you use target keywords and phrases in the text on the site and make sure that text is well written. It is often wise to place a keyword-enriched sentence at the very top of a page, above the company banner. It is also important to know that GoogleBot reads a page much like you read a newspaper, in columns starting at the top left and flowing to the bottom right. If you are using tables or CSS, remember how GoogleBot likes to read. Keyword-enriched text should be presented early in the page.
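In practice, that can be as simple as opening the body of the page with a short keyword-enriched line ahead of the banner graphic. The markup below is only a rough sketch with hypothetical file names:

<body>
<p>Blue Widgets manufactures preformed building widgets and construction widgets for the building trade.</p>
<img src="banner.gif" alt="Blue Widgets :: Construction Supplies">
<!-- remainder of the page layout follows -->
</body>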
Directing GoogleBot
Understanding the behavior of GoogleBot is important at all times, but if there is a major update, it will be important to allow GoogleBot free transit in order to have as many pages of your site spidered as possible. Making pages spiderable is not much more difficult than offering a text link to each page in your site. The most efficient way to open the site up is to use three different levels of internal link mapping. While that sounds somewhat complex, it is quite easy. Across the bottom of each page in the site, establish text links to the index or default page of each sub-directory. This is called the “root navigation map”. Across the bottom of each page in a sub-directory, put links to the other pages in that sub-directory. These maps are called the “sub-dir navigation maps”. Lastly, create a unique page at the root level that acts as an overall sitemap and place a link to it in all sub-dir navigation maps and in the root navigation map. The sitemap page should be designed in basic text and have a link to every page in the site.
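Put together, a root navigation map across the bottom of each page might look something like the following, with the sitemap link sitting alongside the sub-directory links. The URLs here are hypothetical:

<p>
<a href="/blue-widgets/">Blue Widgets</a> |
<a href="/red-widgets/">Red Widgets</a> |
<a href="/ordering/">Ordering</a> |
<a href="/sitemap.html">Site Map</a>
</p>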
Anchor text used in links
Most keyword targets can be expressed in short two-word phrases. These short phrases should be used as the anchor text in your internal links. For example, the Blue Widget Construction Materials Company might phrase a link like this:
<a href="http://www.blue-widgets.com/construction-widgets.php">Construction Widgets</a>
Note the anchor text would read Construction Widgets.
Off-Site Tips
Aside from your competition, the only major off-site element recognized by Google is incoming links. This is an extremely important element, as Google bases your site’s ranking largely on the number of related sites linking to you. There are two factors to consider when examining incoming links: numbers and relevancy. The more relevant the links directed to a site, the better that site will rank. This has always been the basis of PageRank and, knowing that, many SEOs have undertaken a brisk trade in establishing, purchasing or selling links. Some have even started businesses creating pages designed to be relevant in Google’s eyes in mass-market sectors such as health and travel in order to create links as commodities.
Since Google doesn’t like being gamed so easily, the value of incoming links is where Google engineers do most of their tinkering. As stated earlier in this article, Google is getting much better at figuring out specific page topics and overall site context. They are continuously applying advancements in contextualization to their link-relevancy algorithms. My guess is this algorithm update will target unrelated links, lowering their value considerably while rewarding sites with highly relevant links. Google will also likely target companies and sites that exist solely to sell links, thus affecting all associated with such schemes. If your business is based on selling links, watch this update very carefully. If your website has purchased links, you too should watch this update very closely. Unfortunately, when the first signs of trouble come, it will already be too late to take any action aside from removing irrelevant links wherever possible.
Even if there isn’t a major update in the next few weeks, these tips are basic common sense and are based on Google’s current behavior. Using them can only help your site. If there is a major update, being ready is better than being surprised. The problem with predicting a major Google update is that the prediction is made weeks in advance of the event and, of course, may not happen at all. Recent evidence and previous history point to an imminent update, so consider this article your ounce of prevention.
By the way, Happy Holiday Season to all online retailers. Remember, it is only twelve weeks till the season.
Jim Hedger is the SEO Manager of StepForth Search Engine Placement Inc. Based in Victoria, BC, Canada, StepForth is the result of the consolidation of BraveArt Website Management, Promotion Experts, and Phoenix Creative Works, and has provided professional search engine placement and management services since 1997. http://www.stepforth.com/ Tel – 250-385-1190 Toll Free – 877-385-5526 Fax – 250-385-1198