A robots.txt file is a plain-text file that gives instructions to search engine robots, telling them which pages of a site they may or may not crawl. These instructions are expressed through “Allow” and “Disallow” directives, which can apply to a specific bot or to all bots.
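As a minimal sketch, a robots.txt file (served at the root of the site, e.g. https://example.com/robots.txt) might look like the following; the paths and the crawler name “ExampleBot” are hypothetical:

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/        # keep bots out of the admin area
Allow: /admin/public/    # but permit this subdirectory

# Rules for one specific crawler (hypothetical name)
User-agent: ExampleBot
Disallow: /              # block this bot from the entire site
```

Each “User-agent” line opens a group of rules for the named crawler (or “*” for all crawlers), and the “Allow”/“Disallow” lines beneath it list the URL paths that group may or may not fetch.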