
John Mueller Explains Why GSC Shows Crawl Errors Despite Pages Loading Normally


Website SEO agencies use Google Search Console (GSC) to maintain, monitor, and troubleshoot their sites’ presence in Google’s search results. However, there have been reports of errors showing up in GSC even though the affected pages load normally in a browser. A website SEO expert recently asked Google’s John Mueller about the problem, hoping for a solution.

In response, Mueller discussed the possible reasons behind this issue. He also explained that it is not an issue with Googlebot but a problem on the server side.

GSC is a free tool from Google that helps website SEO agencies in many different ways. Although signing up for Search Console is not necessary for Googlebot to crawl and index a website, the tool gives SEOs and site owners extra insight into how to improve their website for better rankings.

The tool provides SEOs and webmasters with reports to see which websites link to theirs, fix indexing problems, ask Google to re-index new or updated content, troubleshoot mobile usability, AMP, and other search features, and more.

The SEO who asked the question, however, had been unable to find a solution, even though they had already tried validating their web pages in GSC.

The SEO explained that GSC reported a server error on some of their web pages. When they checked the pages in question, they found that the pages worked fine in the browser. They also said they had used the “validate” option several times, but Google still flagged the pages as errors, as if something were preventing Googlebot from validating them.

They had been waiting a month for Googlebot to index the pages, but it never did, and the GSC errors had hurt their organic impressions and clicks. Only then did the SEO turn to Mueller for advice.

Mueller replied that when GSC flags web pages as server errors, those errors genuinely occurred while Googlebot was crawling and trying to index the pages. He said that Google does not “invent” errors on web pages, so Googlebot is not at fault in this scenario.

Mueller then explained that some of these site issues are temporary. If so, he reassured the SEO, Googlebot will recrawl the website in the future, and if the error is gone on the next crawl, Googlebot can index the page as it usually would.

Temporary Crawl Errors

Sometimes, such errors are indeed temporary. A server might be down for maintenance, a DNS issue might take part of the internet offline, or an overloaded server might refuse connections and prevent Google from crawling the page.

However, the SEO who asked the question might be experiencing a different situation.

If the web page loads normally in a browser but GSC cannot validate it, the problem lies on the server side. Users can also test the page with other free Google tools, such as the Mobile-Friendly Test or the Rich Results Test. If those tools report the same error, the server-side problem is confirmed.
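One quick way to approximate this kind of check yourself is to request the page with Googlebot’s published user-agent string and see what status code the server actually returns. The sketch below is a minimal illustration, not any official Google tooling; the URL in the usage note is a placeholder you would swap for your own page.

```python
import urllib.error
import urllib.request


def classify_status(status: int) -> str:
    """Bucket an HTTP status code the way GSC's coverage report does."""
    if 200 <= status < 300:
        return "ok"
    if 300 <= status < 400:
        return "redirect"
    if 400 <= status < 500:
        return "client error"
    if 500 <= status < 600:
        return "server error"
    return "other"


def fetch_status(url: str, user_agent: str) -> int:
    """Request the URL with the given User-Agent and return the status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # Error responses (4xx/5xx) still carry a status code worth reporting.
        return err.code


# Googlebot's desktop user-agent string as published by Google.
GOOGLEBOT_UA = (
    "Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; "
    "Googlebot/2.1; +http://www.google.com/bot.html) Chrome/W.X.Y.Z Safari/537.36"
)
```

To use it, call something like `classify_status(fetch_status("https://example.com/page", GOOGLEBOT_UA))` and compare the result with what your browser shows. Note that a server can still treat a browser and the real Googlebot differently (for example, by IP), so this is a first check, not proof.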

Server Issues Can Cause Errors In GSC

Next, Mueller suggested that the SEO’s server itself could be at fault. If the validate feature does not work and the problem recurs regularly, he advised them to contact their hosting provider and work through the issue together. The host may be able to diagnose the problem, or at least check what is going on and how many URLs are affected.

Mueller admitted that it can be tricky: if Googlebot crawls millions of pages from a website and a hundred of them show an error, Google will probably treat it as irrelevant, since errors will almost always appear somewhere.

On the other hand, if Googlebot crawls 200 pages from a website and a hundred of them return an error, that is far more concerning. The website owner would need to resolve the issue immediately or prevent it from happening altogether. Mueller added that this is a problem Google cannot fix for them.
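Mueller’s reasoning here is just a proportion: the same hundred errors matter far more on a small site than on a huge one. A tiny sketch makes the arithmetic explicit:

```python
def error_share(errored: int, crawled: int) -> float:
    """Fraction of crawled URLs that returned a server error."""
    if crawled == 0:
        raise ValueError("no URLs crawled")
    return errored / crawled


# 100 errors out of a million crawled pages is background noise (0.01%)...
assert error_share(100, 1_000_000) == 0.0001
# ...but 100 errors out of 200 crawled pages means half the site is failing.
assert error_share(100, 200) == 0.5
```

The exact threshold Google uses is not public; the point is simply that the error *rate*, not the raw count, is what signals a site-wide problem.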

Diagnose Googlebot Crawl Errors

The SEO community has been using a diagnostic trick to determine whether a problem is a server-wide configuration issue, provided that other websites share the same IP address on the server.

First, identify the IP address the website is hosted on. Then run that IP address through a reverse IP checker to see which other sites are hosted on the same address.

Once they have the list of websites, they can run them through Google tools such as the Rich Results Test or the AMP Test. If a tool reports an error for one or more of those domains, that confirms a server-wide problem.
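The steps above can be sketched in a few lines. Resolving the site’s IP is an ordinary DNS lookup; the reverse-IP step itself needs a third-party lookup service, so the sketch below takes the resulting list of co-hosted domains as input. The domain names and the status-checking callback are illustrative placeholders, not real sites or tools.

```python
import socket


def site_ip(hostname: str) -> str:
    """Step 1: resolve the hostname to the IP address it is hosted on."""
    return socket.gethostbyname(hostname)


def failing_domains(domains, fetch_status):
    """Step 3: probe each co-hosted domain and keep the ones returning 5xx.

    `domains` comes from a reverse-IP lookup (step 2); `fetch_status` is any
    callable that takes a domain and returns its HTTP status code.
    """
    failing = {}
    for domain in domains:
        status = fetch_status(domain)
        if 500 <= status < 600:
            failing[domain] = status
    return failing
```

If `failing_domains` flags more than one domain on the same IP, that points at the server configuration rather than any single website.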

Every server keeps a server log, and that is the place to start when tracking down the cause. The log records the IP address of the visitor that triggered the error, along with the date and time it happened.
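Most web servers (Apache, nginx) write access logs in the common or “combined” format, where each line starts with the visitor IP, then the timestamp in brackets, the request, and the status code. A short sketch like the one below, assuming that log format, pulls out exactly the fields mentioned above for every 5xx response; the sample line in the usage note is made up for illustration.

```python
import re

# Matches the start of an Apache/nginx "combined" log line:
# ip - user [timestamp] "request" status ...
LOG_LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<when>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) '
)


def server_errors(lines):
    """Yield (ip, timestamp, status) for every request that returned a 5xx."""
    for line in lines:
        match = LOG_LINE.match(line)
        if match and match.group("status").startswith("5"):
            yield match.group("ip"), match.group("when"), int(match.group("status"))
```

Feeding it lines such as `'66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /page HTTP/1.1" 503 512 "-" "Googlebot"'` surfaces just the failing requests, so you can see when the errors cluster and which visitors hit them.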

Another typical culprit is the firewall setup: a configuration that is too strict can end up blocking Google. If users cannot access the server logs themselves, they should get in touch with their web host’s customer support.
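Before whitelisting an IP in the firewall, it is worth confirming it really belongs to Googlebot. Google’s documented verification is a two-step DNS handshake: a reverse lookup on the visiting IP should resolve to a hostname ending in googlebot.com or google.com, and a forward lookup on that hostname should return the original IP. The sketch below follows that procedure; the resolver callbacks default to the standard library but can be swapped out, and the IPs shown in the usage note are only examples.

```python
import socket


def is_verified_googlebot(
    ip,
    reverse=lambda ip: socket.gethostbyaddr(ip)[0],
    forward=socket.gethostbyname,
):
    """Verify a crawler IP using Google's reverse-then-forward DNS check."""
    try:
        host = reverse(ip)
    except OSError:
        return False  # No PTR record: cannot be a verified Googlebot.
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # The forward lookup must round-trip back to the same IP.
        return forward(host) == ip
    except OSError:
        return False
```

A call like `is_verified_googlebot("66.249.66.1")` performs live DNS lookups; anything that fails the check while claiming a Googlebot user-agent is a spoofer the firewall can safely keep blocking.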

Get The Best Rankings With Our SEO Services

Here at Position1SEO, we can provide you with the services you need to secure a spot on Page 1 of Google! Our bespoke SEO packages can include handling your Google Search Console, improving your website’s load speed and performance, setting up Google Local, creating sitemaps, and much more.

You can decide which SEO tasks you want us to do for you, and we will use only the best white hat SEO tactics to improve your website’s SEO without risks!

For more information, you can get in touch with us by dialling 0141 404 7515 or emailing us at office@position1seo.co.uk.

Author: Jason Ferry
Jason Ferry is an SEO specialist based in the United Kingdom. Taking pride in his many years of experience in the search engine optimisation industry, he has honed his skills in various digital marketing processes. From keyword marketing, website auditing, and link building campaigns to social media monitoring, he is well-versed in all of them. Jason Ferry’s excellent skills in SEO, combined with his vast experience, make him one of the best professionals to work with in the industry today. Not only does he guarantee outstanding output for everyone he works with, but he also values a deep relationship with them.
