John Mueller Explains Why GSC Shows Crawl Errors Despite Pages Loading Normally
- 21 June 2021
- Jason Ferry
- Website SEO
Website SEO agencies use Google Search Console (GSC) to maintain, monitor, and troubleshoot their sites’ presence on the search engine results pages. However, there have been reports of errors showing up in GSC even though the web pages in question load normally in a browser. A website SEO expert recently asked Google’s John Mueller about the problem, hoping for a solution.
In response, Mueller discussed the possible reasons behind this issue. He also explained that it is not an issue with Googlebot but a problem on the server side.
GSC is a free tool from Google that helps website SEO agencies in many different ways. Although it is not necessary to sign up for Search Console to have Googlebot crawl and index a website, the tool gives SEOs and site owners more insight into how to improve their website for better rankings.
It provides SEOs and webmasters with reports to see which websites link to theirs; fix indexing problems and request re-indexing of new or updated content; troubleshoot issues with mobile usability, AMP, and other search features; and more.
The person who asked the question, however, had been unable to find a solution, even though they had already tried validating their web pages in GSC.
The SEO explained that GSC reported a server error on some of their web pages. When they checked the pages in question, they found that the pages were working fine in the browser. They also said they had used the “validate” option several times, but Google still showed the web pages as errors, as if something were preventing Googlebot from validating them.
They had been waiting for a month for Googlebot to index the pages, but it never did. Moreover, the GSC errors had hurt their organic impressions and clicks. That was when the SEO turned to Mueller for advice.
Mueller replied that when GSC shows server errors for web pages that Googlebot is crawling and indexing, those errors really exist. Google does not “invent” errors on web pages, so Googlebot is not at fault in this scenario.
Mueller then explained that some of these site issues are temporary. If that is the case, he reassured the SEO, Googlebot will crawl the website again in the future, and if the error is gone on the next crawl, it can index the web page as it usually would.
Temporary Crawl Errors
Sometimes, such errors are indeed temporary. A server might be down for maintenance, DNS issues might take part of the site offline, or an overloaded server might prevent Google from crawling the page.
However, the SEO who asked the question might be experiencing a different situation.
If the web page loads normally in a browser but GSC cannot validate it, there is likely an issue on the server. Users can also test the page with other free Google tools, such as the Mobile-Friendly Test or the Rich Results Test. If those tools report the same error, it confirms that the problem lies with the server rather than with GSC itself.
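As a quick first check outside GSC, you can also fetch the flagged page yourself while sending Googlebot’s published user-agent string. The sketch below is a minimal Python example; the URL is hypothetical, and since some servers filter by IP address rather than user agent, this only approximates what Googlebot actually sees.

```python
import urllib.error
import urllib.request

# Hypothetical URL standing in for a page that GSC flags as a server error.
URL = "https://www.example.com/flagged-page"

# Googlebot's published desktop user-agent string.
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

request = urllib.request.Request(URL, headers={"User-Agent": GOOGLEBOT_UA})
try:
    with urllib.request.urlopen(request, timeout=10) as response:
        print(f"{URL} returned HTTP {response.status}")
except urllib.error.HTTPError as err:
    # A 5xx here matches the kind of server error GSC reports.
    print(f"{URL} returned HTTP {err.code}")
except urllib.error.URLError as err:
    print(f"Could not reach {URL}: {err.reason}")
```

A page that consistently returns a 5xx status to this request, but a 200 to a normal browser, is a strong hint that the server treats crawler traffic differently.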
Server Issues Can Cause Errors In GSC
Next, Mueller suggested that the SEO’s server itself could be at fault. If the validate feature does not work and the problem happens regularly, he advised them to contact their hosting provider and work through the issue together. The host may be able to diagnose the problem, or at least double-check what is going on and how many URLs are affected.
Mueller admitted that it could be tricky if, for instance, Googlebot crawls millions of pages from a website and one hundred of them show an error. Google will probably treat that as insignificant, since at that scale a few errors will almost always appear somewhere.
On the other hand, if Googlebot crawls 200 web pages from a website and a hundred of them return an error, that is far more concerning. The website owner would need to resolve the issue immediately or prevent it from happening altogether. Mueller added that this is a problem Google cannot fix.
Diagnose Googlebot Crawl Errors
The SEO community uses a diagnostic trick to determine whether a problem is a server-wide configuration issue, provided that other websites are using the same IP address on the server.
First, identify the IP address the website is on. Then run that IP through a reverse IP checker to see which other sites are hosted on the same address.
Once you have the list of websites, run them through Google tools such as the Rich Results Test or the AMP Test. If the tool reports an error for one or more of those other domains too, that points to a server-wide issue.
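The reverse IP lookup itself relies on a third-party checker, but the forward half of the check is easy to confirm yourself. Here is a minimal Python sketch, with purely hypothetical domain names, that resolves each candidate domain and shows which ones genuinely share an IP address:

```python
import socket
from collections import defaultdict

# Hypothetical domains: your own site plus the neighbours a reverse IP
# checker reported as sharing your server.
domains = ["example.com", "neighbour-one.com", "neighbour-two.net"]

by_ip = defaultdict(list)
for domain in domains:
    try:
        by_ip[socket.gethostbyname(domain)].append(domain)
    except socket.gaierror as err:
        print(f"{domain}: lookup failed ({err})")

# Domains grouped under the same IP really do share a server address.
for ip, hosted in by_ip.items():
    print(f"{ip}: {', '.join(hosted)}")
```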
Every server keeps a log, and that is the place to start when tracking down the cause. Server logs record the IP address of the visitor that triggered the error, along with the date and time it happened.
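If you can read the raw access log, a short script can pull out exactly the entries that matter here. The sketch below assumes the common combined log format used by Apache and nginx, and a hypothetical log path; it prints every 5xx response served to a visitor identifying itself as Googlebot:

```python
import re

# Hypothetical path; adjust for your server and log format.
LOG_PATH = "/var/log/nginx/access.log"

# Combined log format fields: client IP, timestamp in [brackets],
# quoted request line, then the status code.
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "([^"]*)" (\d{3})')

with open(LOG_PATH) as log:
    for line in log:
        match = LINE_RE.match(line)
        if not match:
            continue
        ip, timestamp, request, status = match.groups()
        # Flag server errors (5xx) served to anything claiming to be Googlebot.
        if status.startswith("5") and "Googlebot" in line:
            print(f"{timestamp}  {ip}  {status}  {request}")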
Another typical culprit is the firewall setup, which might be too strict and end up blocking Google. If users cannot access the server logs, they should get in touch with their web host’s customer support.
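Before loosening a firewall rule, it is worth confirming that a logged or blocked IP really belongs to Googlebot. Google documents a reverse-then-forward DNS check for this; the sketch below implements it in Python, with an illustrative IP address standing in for one pulled from your own logs:

```python
import socket

def is_real_googlebot(ip: str) -> bool:
    """Google's documented check: reverse DNS lookup, verify the
    hostname, then forward-confirm it resolves back to the same IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return socket.gethostbyname(hostname) == ip
    except socket.gaierror:
        return False

# Illustrative IP; substitute one taken from your server log.
print(is_real_googlebot("66.249.66.1"))
```

Visitors that fail this check are not Googlebot and can safely stay blocked; genuine Googlebot traffic should always pass it.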
Get The Best Rankings With Our SEO Services
Here at Position1SEO, we can provide you with the services you need to secure a spot on page one of Google! Our bespoke SEO packages can include handling your Google Search Console, improving your website’s load speed and performance, setting up Google Local, creating sitemaps, and much more.
You can decide which SEO tasks you want us to handle, and we will use only the best white hat tactics to improve your website’s SEO without risk!
For more information, you can get in touch with us by dialling 0141 846 0114 or emailing us at office@position1seo.co.uk.