Persistent 5xx Errors Result In Slower Crawling, But Google Won’t Say How Many URLs It Takes
Google’s John Mueller said that persistent 5xx error responses from your server to Google’s requests will result in Google slowing its crawl of your website. John wouldn’t say whether a specific number or percentage of your site’s URLs has to return those errors; he just said that if Google sees persistent 5xx errors, it will slow how it crawls your site.
He said this in a Twitter thread: “Persistent 5xx’s would cause us to crawl slower than usual.” He added, “Persistent errors can mask real errors though, so I’d clean that up.”
He also said there is no limit to the number: “There is no limit, we’d just crawl slower.” But crawling slower might not be a bad thing, he said: “For some sites, crawling slowly is also fine. For others (eg, lots of updates), you might not be so happy with that. In short, …. it depends.”
Here are those tweets:
Persistent 5xx’s would cause us to crawl slower than usual, but if it’s really just 25k URLs that we crawl, it’s probably no big deal. Persistent errors can mask real errors though, so I’d clean that up.
— John (@JohnMu) October 24, 2019
There is no limit, we’d just crawl slower. For some sites, crawling slowly is also fine. For others (eg, lots of updates), you might not be so happy with that. In short, …. it depends
— John (@JohnMu) October 24, 2019
I say, if you see 5xx errors, you probably want to fix them.
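One practical way to spot persistent 5xx errors served to Googlebot is to scan your server’s access logs for them. Below is a minimal Python sketch, assuming an Apache/Nginx combined log format and a log file named access.log; both are placeholders for illustration, not anything Mueller referenced.

import re
from collections import Counter

# In combined log format the status code follows the quoted request line,
# e.g. "GET /page HTTP/1.1" 503 ...
LOG_LINE = re.compile(r'"\S+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) ')

counts = Counter()
with open("access.log") as fh:  # assumed log file name
    for line in fh:
        # Only look at requests identifying themselves as Googlebot.
        if "Googlebot" not in line:
            continue
        match = LOG_LINE.search(line)
        if match and match.group("status").startswith("5"):
            counts[match.group("path")] += 1

# URLs that repeatedly return 5xx to Googlebot are the ones to fix first.
for path, hits in counts.most_common(20):
    print(f"{hits:5d}  {path}")

If the same URLs show up here day after day, that is the kind of persistent error Mueller is describing.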
Forum discussion at Twitter.