Sunday, November 25, 2007

Optimizing Your Website for Different Search Engines: Using the Robots.txt File

If you rank well on one search engine for some keyword or phrase, do not assume you will rank well on the other search engines by default. Every search engine has its own methods and algorithms for ranking sites. So how do you optimize your site for different search engines? It is not as simple as creating several very similar pages with slight differences here and there: if you do, Google's spiders will treat the duplicates as spam and may ban your site or give it very low rankings. So what is the solution? How do you optimize your pages to satisfy and impress different search engines? The answer is the robots.txt file.

A robots.txt file is a plain text file created with a simple text editor like Notepad or WordPad. It tells each search engine which pages to skip, so that your site does not get banned or punished for designing different pages for different search engines. The file must sit in your site's root directory. To stop a search engine from viewing or indexing pages that do not live in the root directory, include the path to the right directory and then list the file as normal.
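For instance, here is a minimal sketch of blocking a single page that lives in a subdirectory. The directory and file names are made up for illustration:

# The * applies the rule to every spider; the path below is hypothetical
User-agent: *
Disallow: /promotions/special-offer.html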

The code to insert in the file is:

User-agent: (spider name)
Disallow: (file name)

The words before the colon never change and are mandatory, but the values after the colon change according to the spider you want to turn away and the files you want it to skip. User-agent names the search engine's spider that you want to keep out of the files listed under Disallow. To disallow more than one file, list the Disallow lines one under the other. For example, if you wish to stop Inktomi from viewing or spidering two pages optimized for Yahoo and three pages optimized for Google, place those five files one under the other, as in the sketch below.
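Here is what that might look like, assuming Inktomi's spider identifies itself as Slurp (its usual user-agent name at the time) and using made-up file names:

# Keep Inktomi's spider away from pages tuned for other engines
# (the spider name and file names below are illustrative)
User-agent: Slurp
Disallow: /yahoo-page-1.html
Disallow: /yahoo-page-2.html
Disallow: /google-page-1.html
Disallow: /google-page-2.html
Disallow: /google-page-3.html

A spider that honours the file will then skip all five pages while the rest of your site stays open to it.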

Robots.txt files can be your saviour if you are optimizing your web pages for more than one search engine, but you need to be very careful when listing the file names, as a small error on your part can turn into a blunder. So go ahead and optimize your pages for different search engines, without the fear of getting banned or punished for spamming, by simply creating a robots.txt file.

http://www.segnant.com/
