Google and WordPress Robots.txt Handling Is Being Looked Into

One of the takeaways from the Webmaster Conference was that if your robots.txt file exists but is unreachable when Google tries to access it, Google won't crawl your site. Google said that about 26% of the time GoogleBot cannot reach a robots.txt file. WordPress might make changes in order to reduce this error rate.
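To make the distinction concrete, here is a minimal sketch of the decision Google describes: a reachable robots.txt (2xx) is obeyed, a missing one (4xx) means crawl without restrictions, and a server error or unreachable file means don't crawl at all. The function name and return labels are my own illustration, not Google's code.

```python
def robots_outcome(status):
    """Classify a robots.txt fetch result the way Google describes it.

    status: the HTTP status code of the fetch, or None if the
    server could not be reached at all.
    """
    if status is None:
        return "do_not_crawl"          # unreachable: hold off crawling the site
    if 200 <= status < 300:
        return "obey_rules"            # file fetched: parse and follow its rules
    if 400 <= status < 500:
        return "crawl_unrestricted"    # no robots.txt: crawl without restrictions
    return "do_not_crawl"              # 5xx: file exists but is unreachable


print(robots_outcome(200))  # obey_rules
print(robots_outcome(404))  # crawl_unrestricted
print(robots_outcome(503))  # do_not_crawl
```

This is why a misconfigured server that returns 5xx for robots.txt is worse than having no robots.txt at all: the 404 case keeps the site crawlable, the 5xx case blocks it entirely.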


Now, with WordPress, Joost de Valk from Yoast asked: "for sites you can't reach the robots.txt for, is a subset of those WordPress sites? A larger subset than you'd normally expect maybe?" He added that he is "trying to figure out if we should be safer in how WordPress generates robots.txt files."

Gary Illyes from Google said he believes WordPress is generally okay with this issue but he will look into it further to see if WordPress can make some small changes here.
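For context on what WordPress generates today: when no physical robots.txt file exists, WordPress serves a virtual one. The default output looks roughly like this (the exact content can vary by version and by the site's search-visibility setting):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

Since this file is generated dynamically by PHP, a site whose backend is erroring can return a 5xx for robots.txt even though the rest of the site might be cacheable, which is presumably the kind of "safer" handling Yoast and Google are discussing.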

I love this dialog between Google and Yoast (which is very tied to WordPress).

Forum discussion at Twitter.

Update: I upset Gary again, and for the record, the latest intel is that the 26% figure is the percentage of robots.txt files Google cannot reach.
