Google Guide: How HTTP Status Codes Affect SEO
SEO professionals must ensure their HTTP status codes are correct, as these codes can change how a website appears in search results. According to a recent tweet, Google's Gary Illyes helped create a new reference guide that SEOs can consult when they are unsure how a specific status code affects SEO.
The new guide contains a lot of standard information that is familiar to many SEOs, but it does not hurt to refresh one's knowledge of status codes with the most recent information. It covers the 20 status codes that Google's crawlers encounter most often on the web, along with the most prominent network and DNS issues.
The server hosting a site generates an HTTP status code whenever a browser or a crawler like Googlebot requests a piece of content. For example, if the content is no longer available, the server returns a 404 status code.
The first digit of a status code indicates its category. For example, all "2xx" codes mean successful crawling, while "3xx" codes are redirects, and so on. Although the guide discusses 20 individual status codes, the key details for each category are summarised below.
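The category rule above can be sketched in a few lines of Python. This is our own illustrative helper, not something from Google's guide; the function name and labels are assumptions for the example.

```python
# Hypothetical helper illustrating the status-code categories described above.
def status_category(code: int) -> str:
    """Map an HTTP status code to the category the guide groups it under."""
    if 200 <= code <= 299:
        return "success"       # 2xx: content crawled, passed to indexing
    if 300 <= code <= 399:
        return "redirect"      # 3xx: Googlebot follows the redirect target
    if 400 <= code <= 499:
        return "client error"  # 4xx: content treated as nonexistent
    if 500 <= code <= 599:
        return "server error"  # 5xx: crawling temporarily slowed down
    return "unknown"

print(status_category(301))  # redirect
print(status_category(404))  # client error
```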
HTTP 2xx (success)
Status codes that start with "2" mean that search engines can crawl the content and move it to their indexing pipeline. Google points out that having an HTTP 2xx status code does not mean that indexing is guaranteed. It simply indicates that Googlebot did not encounter any errors while it was crawling.
However, there is one exception: the 204 status code. This code means that the crawler successfully accessed the page but found no content there. Google may display a soft 404 error in Search Console for pages that return a 204 code.
HTTP 3xx (redirects)
Status codes that start with "3" mean redirects, but not all of them are equal. In terms of which URLs are considered canonical, the HTTP 301 status code sends stronger signals than 302, 303, or 307 codes.
On the other hand, a 304 status code tells Google that the content is the same as the last time the page was crawled. This code does not affect the indexing process, but it may cause the search engine to recalculate the signals for the URL.
In some situations, the redirect does not work. If this happens, Googlebot will follow up to 10 redirect hops before it stops trying. Search Console will display a redirect error in the website's Index Coverage report if the search engine does not receive the content within 10 hops.
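The 10-hop limit can be illustrated with a short sketch. This is our own simplified model, using a hypothetical in-memory redirect map rather than live HTTP requests.

```python
MAX_HOPS = 10  # Googlebot stops following redirects after this many hops

def follow_redirects(url, redirects):
    """Follow a chain of redirects; return the final URL, or None once the
    chain exceeds MAX_HOPS (shown as a redirect error in Search Console's
    Index Coverage report)."""
    hops = 0
    while url in redirects:
        if hops >= MAX_HOPS:
            return None  # Googlebot gives up; no content was received
        url = redirects[url]
        hops += 1
    return url

chain = {"/old": "/new", "/new": "/final"}
print(follow_redirects("/old", chain))  # /final
```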
HTTP 4xx (client errors)
Web pages that return a 4xx status code are not eligible for indexing in Google's search results. Except for 429, all 4xx errors are treated the same: they tell Googlebot that the content does not exist. If the content existed before, Google removes the URL from its search index.
The 429 status code, on the other hand, means that Googlebot could not access the URL because the server was overloaded, so Google preserves those URLs in its index.
HTTP 5xx (server errors)
5xx status codes tell Googlebot to temporarily slow down crawling. URLs that it previously indexed eventually get dropped if they serve a 5xx status code.
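The "slow down" behaviour on 5xx responses resembles a retry loop with exponential backoff. The sketch below is our own hedged illustration, not Google's actual crawler logic; `fetch` is a hypothetical callable that returns the HTTP status code for a URL.

```python
import time

def crawl_with_backoff(fetch, url, max_retries=5, base_delay=1.0):
    """Retry on 5xx responses with exponential backoff, mirroring how a
    polite crawler slows down; a URL that keeps serving 5xx would
    eventually be dropped from the index."""
    status = fetch(url)
    for attempt in range(max_retries):
        if not (500 <= status <= 599):
            return status
        time.sleep(base_delay * (2 ** attempt))  # back off before retrying
        status = fetch(url)
    return status
```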
Network And DNS Errors
Network and DNS errors can negatively affect how Google displays URLs in the search results. Googlebot treats connection resets, network timeouts, and DNS errors similarly to 5xx server errors. When these issues occur, Google slows down its crawling, since network errors suggest that the server cannot handle the serving load. Moreover, the search engine will remove unreachable URLs that it has already indexed from the index after several days. Search Console may also report an error for each issue.
Debugging network errors
Network errors occur when Googlebot begins or is in the middle of crawling a URL. Because they occur before the server responds, there is no status code to hint at the problem, which makes these errors harder for Google's systems to diagnose.
Here are a few steps to debug network errors:
- SEO companies should check their firewall settings, as an overly broad blocking rule may be in place.
- Next, they should check the network traffic. Tools like Wireshark and tcpdump can capture and analyse the TCP packets, and can help find anomalies that point to a specific server module or network component.
If these two steps turn up nothing suspicious, SEOs should contact their hosting company, as the fault may lie in the server components that handle network traffic. For instance, an overloaded network interface might drop packets, leading to connection resets or timeouts.
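Before reaching for packet captures, a quick probe can distinguish a timeout (packets silently dropped) from an actively refused connection. This is our own minimal sketch, not part of Google's guide.

```python
import socket

def probe(host, port, timeout=3.0):
    """Attempt a TCP handshake to host:port and classify the outcome."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "ok"         # handshake completed; server is reachable
    except socket.timeout:
        return "timeout"        # packets dropped: possible overload or firewall
    except ConnectionRefusedError:
        return "refused"        # connection actively reset by the host
    except OSError as exc:
        return f"error: {exc}"  # e.g. DNS failure or unreachable network
```

For example, `probe("www.example.com", 443)` returning "timeout" while the site loads in a browser would point towards a firewall rule that singles out the probing network.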
Debugging DNS errors
As for DNS errors, they often occur due to misconfiguration. SEOs can do the following steps to debug their DNS errors:
- SEOs can check their DNS records. Their CNAME and A records should point to the right hostname and IP addresses, respectively.
- They should also check that their name servers point to the correct IP addresses for their website.
- If SEOs made changes to their DNS configuration within the last 72 hours, they may need to wait for the changes to propagate across the global DNS network.
- If the SEO runs their own DNS server, they should ensure that it is healthy and not overloaded.
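The first check in the list above can be automated with a short script. This is our own sketch rather than something from the guide; the hostname and IP in the usage note are placeholders for your own records.

```python
import socket

def resolves_to(hostname, expected_ip):
    """Return True if DNS resolution for hostname includes expected_ip."""
    try:
        addresses = {info[4][0] for info in socket.getaddrinfo(hostname, None)}
    except socket.gaierror:
        return False  # a DNS error: the name did not resolve at all
    return expected_ip in addresses
```

For instance, `resolves_to("www.example.com", "203.0.113.10")`, with your own hostname and server address substituted in, confirms that the A record points where you expect.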
Work With The Experts To Boost Your SEO!
Our team has the right people who can help you achieve the top position on Google Page 1 with zero penalties, thanks to our no-stone-unturned approach and white hat SEO techniques.
We will provide you with a free in-depth SEO audit to help you identify your site's current issues. Afterwards, we will talk about the problems highlighted in your SEO audit report, and you can decide the tasks that you want us to do for you. Some of the typical areas we work on include Google Search Console, website load speed and performance, link detox, and more.
For more information about our services, you can call us on 0141 404 7515 or email us at firstname.lastname@example.org.