
Robots.txt

robots.txt is the filename used to implement the Robots Exclusion Protocol, a standard that websites use to tell visiting web crawlers and other web robots which parts of the site they may visit. The file is placed at the root of the website (for example, https://example.com/robots.txt). The standard, proposed in 1994, relies on voluntary compliance: well-behaved crawlers honor it, but it is not an access-control mechanism.
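A minimal example of the format (the User-agent, Disallow, and Allow directives come from the standard; the paths and sitemap URL shown here are hypothetical placeholders):

    # Rules for all crawlers
    User-agent: *
    Disallow: /private/
    Allow: /private/status.html

    # Stricter rules for one specific crawler
    User-agent: Googlebot
    Disallow: /tmp/

    # Sitemap is a widely supported extension, not part of the original standard
    Sitemap: https://example.com/sitemap.xml

Each group of rules applies to the crawlers matched by the preceding User-agent line, and a more specific Allow rule can carve an exception out of a broader Disallow.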

