Robots.txt is the filename used for implementing the Robots Exclusion Protocol, a standard used by websites to indicate to visiting web crawlers and other web robots which portions of the website they are allowed to visit. The standard, developed in 1994, relies on voluntary compliance.
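To make the protocol concrete, the sketch below uses Python's standard-library robots.txt parser to check whether a crawler may fetch a given URL. The robots.txt rules, the "ExampleBot" user-agent name, and the example.com URLs are illustrative assumptions, not taken from any particular site.

```python
# Minimal sketch: parse a robots.txt document and ask whether a crawler
# may fetch specific URLs. Rules, bot name, and URLs are hypothetical.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# /private/ is disallowed for all user agents; everything else is allowed.
print(parser.can_fetch("ExampleBot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("ExampleBot", "https://example.com/index.html"))         # True
```

Note that this check only reports what the site requests; because the protocol relies on voluntary compliance, nothing prevents a crawler from fetching disallowed pages anyway.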
See also
- Robots.txt on Wikipedia
Updated on April 20, 2024 by Datarist.