What is robots.txt, and what is its significance in SEO?
A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google.
Robots.txt tells search engine spiders not to crawl specific pages on your website. You can check how many pages you have indexed in Google Search Console. If that number matches the number of pages you want indexed, you don't need to bother with a robots.txt file.
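To illustrate, here is a minimal sketch of a robots.txt file, placed at the root of a site (e.g. `example.com/robots.txt`). The directory names are hypothetical examples, not paths from any real site:

```
# Applies to all crawlers
User-agent: *
# Block crawling of these example directories
Disallow: /admin/
Disallow: /checkout/
# Explicitly allow everything else
Allow: /

# Optional: point crawlers to the sitemap (example URL)
Sitemap: https://example.com/sitemap.xml
```

Note that `Disallow` only discourages crawling; a page blocked this way can still appear in search results if other sites link to it. To keep a page out of the index, use a `noindex` meta tag or HTTP header instead.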