Google wants to formalize the Robots Exclusion Protocol specification

Google has announced in a blog post that it wants to formalize the Robots Exclusion Protocol (REP) specification.

Robots.txt specification

“For 25 years, the Robots Exclusion Protocol (REP) has been one of the most basic and critical components of the web. It allows website owners to exclude automated clients, for example web crawlers, from accessing their sites – either partially or completely. […]

The proposed REP draft reflects over 20 years of real world experience of relying on robots.txt rules, used both by Googlebot and other major crawlers, as well as about half a billion websites that rely on REP.”
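As an illustration of what the quote describes (this example is not from Google's post, and the path is hypothetical), a minimal robots.txt that partially excludes automated clients might look like this:

```
# Block all crawlers from the /private/ section of the site
User-agent: *
Disallow: /private/
```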

How to check the robots.txt file of your website

Among many other things, the website audit tool in SEOprofiler checks your website's robots.txt file. If there is an issue with the file, the audit tool will reveal it.

Check your robots.txt file
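If you want a quick sanity check of individual robots.txt rules yourself, Python's standard library includes a REP parser. The sketch below uses hedged example rules and URLs (the `example.com` addresses and `/private/` path are assumptions, not taken from the article):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules: block all crawlers from /private/
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether a generic crawler ("*") may fetch specific URLs
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True
```

In practice you would point the parser at a live file with `set_url()` and `read()` instead of pasting the rules inline.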

Tom Cassy

Tom Cassy is the CEO of SEOprofiler. He blogs about search engine optimization and website marketing topics.