Wednesday, April 12, 2006

Google’s quality after the Jagger update

Have you noticed something different about Google recently? Since its Jagger update, the webmaster community has become panicky. But what is the point of panicking, when the whole idea behind any update is to improve the quality of the search results? If a site is relevant, it deserves to rank high in the results. Google is not in the business of earning commissions from the sites that appear in its results; its only goal is to give users relevant results. And if webmasters are providing relevant sites, those sites will certainly appear higher in the rankings.

For roughly two years Google has rolled out a steady series of algorithm and filter changes, which have made the search results unpredictable and have, along the way, dropped many clean (non-spam) websites from the rankings. Back then Google's updates were monthly; after a while they became a quarterly affair. The latest of these is the Big Daddy update, which is both an update of Google's infrastructure and an algorithm update. It is Big Daddy's doing that pages seem to go from a first-page ranking to a spot on the 100th page, or worse yet, into oblivion. The algorithm changes started in November 2003 with the Florida update, followed by Austin, Brandy, Bourbon, and Jagger. Now SEO professionals and webmasters are busy coping with Big Daddy!

The algorithm updates mainly deal with canonical issues, duplicate content issues, the Sandbox effect, and supplemental page issues. Canonical issues arise when the search engine treats www.abc.com, abc.com, and abc.com/index.html as three different websites. When Google caches them, it sees the same content replicated across them and penalizes the site accordingly. Getting penalized means every page of the site loses ranking. Search engines such as Yahoo and MSN are not as finicky about these kinds of issues.
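In practice the fix is to pick one preferred URL and permanently redirect (HTTP 301) every other variant to it. The short Python sketch below is only an illustration of that idea, using a made-up example.com domain; on a live site the same normalization is usually done with a rewrite rule on the web server.

    # Illustration only: fold the URL variants Google may treat as separate
    # sites into one canonical address. "example.com" is a hypothetical
    # domain used just for this sketch.
    from urllib.parse import urlsplit, urlunsplit

    CANONICAL_HOST = "www.example.com"  # the one hostname we want indexed

    def canonicalize(url):
        scheme, host, path, query, _ = urlsplit(url)
        host = host.lower()
        if host in ("example.com", "www.example.com"):
            host = CANONICAL_HOST            # bare domain -> www form
        if path in ("", "/index.html"):
            path = "/"                       # index page -> site root
        return urlunsplit((scheme or "http", host, path, query, ""))

    for u in ("http://example.com",
              "http://www.example.com/index.html",
              "http://www.example.com/"):
        print(u, "->", canonicalize(u))

Anything that still requests the old variants would then get a 301 redirect to the canonical form, so only one copy of the site ever gets cached.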

The Sandbox effect hits almost all new websites by placing them on a preliminary "probation" status. A new website, once it is search-engine friendly and indexed, may perform well for a couple of weeks, but when the filter is applied (called sandboxing) the site will still appear in the result pages, only it will not rank well, regardless of how much unique, well-optimized content it has or how many quality inbound links it attracts. The theory behind the Sandbox is that Google knows a 100,000-page website is not built in a day, so it imposes a kind of time penalty on new sites and new links before they are fully trusted and indexed.

Replicating content is a black-hat SEO tactic and a major issue on the Internet. Some webmasters, unable to produce their own content, simply republish web promotional articles on their websites under their own domain names. Google has retaliated against this abuse by introducing a duplicate content penalty. But in the process Google has also punished legitimate sites by mistaking them for the duplicates. Google may have to reconsider this filter, because a news story is often covered by many websites. As for web promotional content, webmasters can track the sites that have violated their copyrights and submit a spam report to Google. But filtering news stories will unavoidably catch some legitimate use as well.
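For what it is worth, here is a rough sketch of how a near-duplicate filter could work, comparing pages by the overlap of their word "shingles". This is purely an assumption for illustration; Google's actual duplicate-content filter has never been published.

    # Assumption / illustration only: flag two pages as near-duplicates when
    # their overlapping 4-word shingles have a high Jaccard similarity.
    def shingles(text, size=4):
        words = text.lower().split()
        return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

    def similarity(a, b):
        sa, sb = shingles(a), shingles(b)
        if not sa or not sb:
            return 0.0
        return len(sa & sb) / len(sa | sb)

    page_a = "our widget is the best widget on the market so buy it today"
    page_b = "our widget is the best widget on the market so order one now"

    # A score close to 1.0 suggests one page simply copies the other.
    print(round(similarity(page_a, page_b), 2))

A legitimate news site covering the same story as everyone else would of course trip a filter like this too, which is exactly the problem described above.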

Supplemental page issues, also known as "Supplemental Hell", have been bothering webmasters for over a year. The Supplemental index is a kind of graveyard for webpages that are old or have returned errors; once pages are deemed inactive, they seldom come back to life. And the worst part is that no one disputes the need for a Supplemental index.
The only solution is: be ethical.
