Google listing pages excluded by robots.txt

Matt Cutts has recently explained why Google still indexes pages which have been excluded via the robots.txt file.

It turns out that Google will still link to such pages but will not actually crawl their content, so they appear in search results without a title or snippet.
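To see how a crawler decides what it may fetch, here is a minimal sketch using Python's standard-library `urllib.robotparser`; the `example.com` URLs and the `Disallow` rule are hypothetical, purely for illustration.

```python
from urllib import robotparser

# Parse a hypothetical robots.txt that blocks the /private/ section.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A blocked page: Google may still link to it, but won't fetch its content.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False

# An unblocked page can be crawled normally.
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
```

This is exactly the distinction the post describes: the blocked URL can still end up in the index via links from other pages, but because the crawler never fetches it, Google has no title or snippet to display.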

[Screenshot: search result with title and snippet]

[Screenshot: search result without title and snippet]

To remove a page from Google’s index completely you can do one of two things: add a noindex meta tag to the page itself, or use Google’s URL removal request tool. If you use the removal tool, be certain that you really want the page removed, as it can be tricky to get it back into the index later.
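As a sketch of the first option, the noindex directive is a standard meta tag placed in the page's `<head>`; note that for Google to see it, the page must not also be blocked by robots.txt, since a blocked page is never fetched.

```html
<head>
  <!-- Tells crawlers not to include this page in their index -->
  <meta name="robots" content="noindex">
</head>
```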
