What is robots.txt?

Robots.txt is a plain text file located at the root of a website (for example, https://example.com/robots.txt) that instructs search engine robots (also known as crawlers or bots) which parts of the site they may or may not crawl. The file is part of the "Robots Exclusion Protocol" and provides guidelines on what should and should not be crawled; well-behaved crawlers follow these rules, although compliance is voluntary rather than enforced.
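
As a sketch, a minimal robots.txt might look like the following (the specific paths and sitemap URL are illustrative, not part of any real site):

```
# Rules that apply to all crawlers
User-agent: *
# Block crawling of the (hypothetical) admin area
Disallow: /admin/
# Re-allow a public subsection within the blocked area
Allow: /admin/public/
# Point crawlers at the site's XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Each group starts with a User-agent line naming the crawler it applies to (`*` matches all), followed by Disallow and Allow rules; the Sitemap directive is standalone and can appear anywhere in the file.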