What is the robots.txt file?

A robots.txt file tells search engine crawlers which pages or files the crawler can or can’t request from your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google.
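For illustration, a minimal robots.txt might look like the following (the domain and paths are hypothetical). The file always lives at the root of the site, e.g. https://www.example.com/robots.txt:

    # Applies to all crawlers
    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/

    Sitemap: https://www.example.com/sitemap.xml

Each User-agent group names the crawlers it applies to, and each Disallow line gives a path prefix those crawlers are asked not to request.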

  • Robots.txt tells search engine spiders not to crawl specific pages on your website, though a disallowed page can still end up indexed if other sites link to it. You can check how many of your pages are indexed in Google Search Console. (A sketch for testing your rules programmatically follows below.)
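
As a quick sanity check of how a crawler will interpret such rules, Python's standard-library urllib.robotparser can parse a robots.txt file and answer per-URL questions. This is a minimal sketch using the hypothetical rules from the sample above; for a live site you would fetch the real file with set_url() and read() instead:

    from urllib.robotparser import RobotFileParser

    # Hypothetical rules matching the sample file above; for a live site,
    # use rp.set_url("https://www.example.com/robots.txt") and rp.read().
    rules = [
        "User-agent: *",
        "Disallow: /admin/",
        "Disallow: /tmp/",
    ]

    rp = RobotFileParser()
    rp.parse(rules)

    # /admin/ is disallowed for every user agent; /blog/ is not mentioned
    # and is therefore allowed by default.
    print(rp.can_fetch("Googlebot", "https://www.example.com/admin/secret.html"))  # False
    print(rp.can_fetch("Googlebot", "https://www.example.com/blog/post.html"))     # True

Note that this reflects the standard-library parser's interpretation; individual search engines may apply slightly different matching rules.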