What is the robots.txt file used for?

The robots.txt file is a plain text file located in the root directory of a site that contains special instructions for search robots. These instructions can prohibit indexing of certain sections or pages of the site, point robots to the correct domain "mirror", ask a robot to observe a minimum time interval between downloading documents from the server, and so on.
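For illustration, a single robots.txt can combine several of these directives. The following is a minimal sketch: the domain, the sitemap URL, the disallowed path, and the 10-second delay are placeholder values, and the Host directive is a Yandex-specific way to name the main domain mirror:

# Apply the rules below to all robots
User-agent: *
# Ask robots to wait at least 10 seconds between downloads (non-standard)
Crawl-delay: 10
# Keep service pages out of the index
Disallow: /cgi-bin/
# Name the main domain mirror (Yandex-specific directive)
Host: example.com
# Point robots to the sitemap (hypothetical URL)
Sitemap: https://example.com/sitemap.xml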

Some useful features of this file are shown below; a short script for checking the resulting rules follows the list.
1. Tell robots to download site pages at intervals of at least 20 seconds, useful on a VPS/CloudServer to reduce the load (note that Crawl-delay is a non-standard directive and not every robot honors it):
User-agent: *
Crawl-delay: 20
2. Exclude the /admin directory from indexing by all robots:
User-agent: *
Disallow: /admin/
3. Prohibit all robots from indexing the entire site:
User-agent: *
Disallow: /
4. Allow indexing for one robot (here, Yandex) and prohibit it for all others. An empty Disallow directive permits the whole site; note the blank line separating the two rule groups:
User-agent: Yandex
Disallow:

User-agent: *
Disallow: /
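You can check how these rules will be interpreted using the robots.txt parser from Python's standard library. This is a minimal sketch, assuming the file is served at the placeholder address https://example.com/robots.txt and contains the rules from examples 1 and 2 above:

from urllib.robotparser import RobotFileParser

# Download and parse the site's robots.txt (placeholder URL)
parser = RobotFileParser("https://example.com/robots.txt")
parser.read()

# May any robot fetch a page under /admin/? False under example 2
print(parser.can_fetch("*", "https://example.com/admin/"))

# Crawl delay requested for all robots: 20 under example 1 (Python 3.6+)
print(parser.crawl_delay("*"))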

