Google: you have to optimize your pages if blocked pages outrank your regular pages

On Twitter, Google’s John Mueller said that you have to work on your website if pages that are blocked by your robots.txt file outrank your regular pages in Google’s search results.


What is robots.txt?

A robots.txt file is a simple text file in the root directory of a website (www.example.com/robots.txt). It enables you to tell crawlers which areas of your website should not be processed or crawled. Note that not all robots follow the standard.
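For example, a minimal robots.txt file that asks all compliant crawlers to stay out of a directory could look like this (the /private/ path is purely illustrative):

    # Applies to all crawlers that respect robots.txt
    User-agent: *
    Disallow: /private/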

If you block a page with robots.txt, Google may still index the URL of the page (for example, when other pages link to it), but not the content of the page.
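As a rough illustration of how a compliant crawler interprets these rules, here is a minimal sketch using Python's urllib.robotparser; the example.com URLs and the /private/ rule are assumptions for the example, not taken from the tweet discussed below:

    # Minimal sketch: check whether a crawler may fetch a URL,
    # based on hypothetical robots.txt rules for example.com.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.parse([
        "User-agent: *",
        "Disallow: /private/",
    ])

    # A compliant bot never fetches the blocked URL, so its content is unknown to Google.
    # The URL itself can still end up in the index if other pages link to it.
    print(parser.can_fetch("Googlebot", "https://www.example.com/private/page.html"))  # False
    print(parser.can_fetch("Googlebot", "https://www.example.com/public/page.html"))   # True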

What’s the problem with robots.txt?

On Twitter, a webmaster complained that pages that he had excluded in his robots.txt file still showed up in Google’s search results.

Google’s John Mueller replied that this means the website owner has to work on those web pages.

Optimize your web pages

If blocked pages rank higher on Google than your regular pages, you have to optimize the content of your pages and improve the links that point to them. The tools in SEOprofiler help you with that:

Try SEOprofiler now!

Tom Cassy

Tom Cassy is the CEO of SEOprofiler. He blogs about search engine optimization and website marketing topics at http://blog.seoprofiler.com.