Misconfiguring Robots.txt could be fatal

As one of the 50+ elements used in its SEO evaluation, SEO MASTER Express checks the directives in the robots.txt file, the file in the top-level directory of the server that controls crawlers.


What is the robots.txt file and how do its directives control crawlers?

The robots.txt file contains directives that control search engine crawlers. It is primarily used to prevent a search engine robot (crawler) from crawling a webpage and registering it in the search engine's index.

Proper configuration starts with creating a robots.txt file containing the directives that tell crawlers how they may move through the website. The file is then uploaded to your website's top-level directory via FTP. The robots.txt file is typically used to keep crawlers away from unnecessary pages and thereby make crawling of the website more efficient.
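As a minimal sketch, the uploaded file must sit directly under the domain root so that crawlers can fetch it at a URL such as https://www.example.com/robots.txt (example.com is a placeholder). The smallest valid file simply declares which crawlers it addresses and leaves the Disallow rule empty, meaning nothing is blocked:

User-agent: *
Disallow: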


Example of how to write a robots.txt file

A robots.txt file is a plain text file written as follows:

User-agent: Googlebot
Disallow: /seo/
Disallow: /user_data/
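
In this example, the crawler named Googlebot is told not to crawl any URL under the /seo/ or /user_data/ directories; crawlers that do not match this User-agent line are unaffected.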


Common mistakes in writing a robots.txt file

Unless it is intentionally written to keep a crawler away from a particular file or directory, restricting crawlers will interfere with SEO. For example, if you mistakenly block an entire directory in the robots.txt file, the pages under that directory will no longer appear in search results.
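As a hypothetical illustration (the /user_data/ path is a placeholder), compare an intended rule

User-agent: *
Disallow: /user_data/

with a mistaken one that, because of the bare slash, blocks every page on the site for all crawlers:

User-agent: *
Disallow: /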


Robots.txt is sometimes pre-set for certain pages

Depending on the rental server or CMS, robots.txt may already be pre-set for a login page or a shopping cart page (the page that appears directly after you click a purchase button, as opposed to the product page itself).


Login page

If a login page is displayed in search results, the risk of login information being displayed incorrectly increases. For this reason, crawlers are generally blocked from the login page by default. In this case, you do not need to modify the robots.txt file.
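Such a default might look like the following sketch (the /login/ path is a placeholder; the actual path depends on the server or CMS):

User-agent: *
Disallow: /login/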


Shopping cart pages

The robots.txt file is also usually pre-set so that shopping cart pages are not crawled, again to increase the crawling efficiency of the search engine robot. In this case, too, you do not need to modify the robots.txt file. Shopping cart pages are often generated dynamically with a specific URL parameter, which makes them easy to exclude from crawling. If every shopping cart page were crawled, crawling efficiency for the website would drop significantly and the more important pages would be crawled more slowly.
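A sketch of such a default, assuming a placeholder /cart/ directory and a placeholder cart_id parameter (wildcard patterns are honored by major crawlers such as Googlebot, but not guaranteed for every robot):

User-agent: *
Disallow: /cart/
Disallow: /*?cart_id=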


Check your robots.txt description and control crawlers

SEO MASTER Express identifies errors and provides advice when there is a problem with a robots.txt file.
*If you have intentionally set a robots.txt file to control crawlers so that they only crawl a particular file or directory, you don’t need to fix it.


Download the 14-Day Free Trial Now!

