A robots.txt file is a plain text file webmasters create to instruct web robots how to crawl pages on their site. It is part of the Robots Exclusion Protocol (REP), a group of web standards that govern how search engine bots crawl the web, access and index content, and serve that content up in search results.

In simple terms, the robots.txt file tells search engines which pages they may crawl, which ones to ignore, and which ones the site owner would prefer to keep out of search results. More advanced settings are available as well.
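For example, a minimal robots.txt might look like the sketch below. The /private/ directory, the exception page, and the sitemap URL are placeholders rather than values from any particular site:

    User-agent: *
    Disallow: /private/
    Allow: /private/public-page.html
    Sitemap: https://www.example.com/sitemap.xml

Here every crawler (User-agent: *) is asked not to request anything under /private/, with one page carved out as an exception. The Allow rule is widely honored by major search engines, though it is an extension rather than part of the original standard.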

Posted By Nick Berns

Nick Berns is a web developer & SEO specialist.