
Google May Be Planning to Reduce Web Page Crawl Rate


As Google becomes more aware of the sustainability of crawling and indexing in SEO, it may reduce the number of times that SEO pages are crawled. John Mueller, Martin Splitt, and Gary Illyes from Google’s Search Relations department addressed this topic in the most recent episode of Search Off The Record.

The three talked about what to anticipate from Google in 2022. One of the topics they discussed was crawling and indexing, which, as SEO experts and website owners noticed, became less frequent over the past year.

Google’s goal this year is to conserve computing resources to make crawling more sustainable. Here’s what it means for the SEO community and their websites’ performance in the search results.

Sustainability of Crawling and Indexing in SEO

Because Googlebot crawls and indexes pages digitally, most people assume it has no impact on the environment. However, Illyes pointed out that computing carries a real environmental cost.

He noted that Bitcoin mining, for example, has a measurable impact on the environment, especially when the electricity comes from coal-fired or other less sustainable power plants.

Illyes also said that Google has been carbon-free for well over a decade. However, the search engine company is still aiming to further decrease its environmental footprint. Crawling is one of those things that they can reduce to achieve this goal. In this case, Google can reduce unnecessary crawling for web pages that did not undergo any recent changes.

Google’s Plan to Make Crawling More Sustainable

Illyes explained that reducing the amount of crawling is one way to make it more sustainable. Googlebot performs two types of web crawling: crawling to discover new content and crawling to refresh already-known content. Google is considering reducing the second type.

This is how Google refreshes content: Googlebot first visits a URL and crawls it, then returns after some time to re-crawl it and see whether the publisher has made changes. That return visit is known as a refresh crawl, and it happens every time Googlebot revisits a URL.
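A refresh crawl does not always have to download the full page. Standard HTTP conditional requests let a server tell a crawler that nothing has changed since the last visit. The sketch below is a simplified illustration of that mechanism, not Google's actual logic; the function name and parameters are hypothetical:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def respond_to_refresh_crawl(if_modified_since, last_modified):
    """Decide the HTTP status for a crawler's conditional re-crawl.

    if_modified_since -- the crawler's If-Modified-Since header, or None
    last_modified     -- when the page's content last changed (aware datetime)
    """
    if if_modified_since:
        try:
            cached_copy = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            cached_copy = None
        if cached_copy and last_modified <= cached_copy:
            # 304 Not Modified: the crawler can keep its cached copy,
            # saving bandwidth and processing on both ends
            return 304
    # Content changed (or no validator was sent): serve the full page
    return 200
```

Returning 304 for genuinely unchanged pages is one of the few things a publisher can already do to make refresh crawls cheaper.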

The question is, how often does Google revisit a single URL?

Illyes then mentioned that specific types of websites need their pages re-crawled more compared to others. For instance, a news website’s homepage will almost always update; therefore, it requires a lot of refresh crawls.

However, the same news website won’t modify its About page as often, so Google doesn’t need to perform refresh crawls on those kinds of pages. Illyes also admitted that they often can’t estimate how frequently they perform refresh crawls.

He thinks that revisiting the same page repeatedly is a waste of time and resources. Sometimes, they revisit 404 pages for no good reason. So, there is plenty of room for improvement in terms of reducing Google’s footprint.

The search engine giant has yet to confirm whether it will reduce its refresh crawls, but if it ever does, the change could have several effects on websites.

The Effect of Crawl Rate Reduction on Websites

Mueller then asked Illyes about the idea that having a high crawl rate is a positive SEO signal. Some SEOs believe that it is good if Google crawls their website frequently, even if they rarely update their content. According to Illyes, this is a complete misconception, as web pages do not receive ranking bonuses if Google frequently crawls them.

Mueller added that SEOs and site owners should not force re-crawls of existing content that hasn't changed, since doing so confers no ranking bonus.

Anticipating a Reduction in Crawl Rate

Google has not confirmed whether it will reduce its refresh crawls, but the team is currently considering it. If Google does implement this, it should not harm site rankings. After all, more crawling does not guarantee higher positions in the search results.

In addition, the goal is to figure out which pages require refresh crawls and which don’t. That implies that publishers who update their pages will most likely continue to be refreshed in the search results.
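One established way for publishers to signal which pages have changed, and which haven't, is an XML sitemap with accurate `<lastmod>` dates. Below is a minimal sketch using only Python's standard library; the URLs and dates passed in are placeholders:

```python
import xml.etree.ElementTree as ET

# Official sitemap protocol namespace (sitemaps.org)
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build a minimal sitemap from (url, iso_date) pairs.

    Accurate <lastmod> values help crawlers skip unchanged URLs
    instead of re-crawling them on every visit.
    """
    ET.register_namespace("", NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = loc
        ET.SubElement(url, f"{{{NS}}}lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")
```

Only list a fresh date when the content genuinely changed; crawlers learn to distrust sitemaps whose `<lastmod>` values are always "today".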

Tips on Improving Indexation

Most site owners stop worrying about crawl budget once a site is live or has been established for a while, assuming that as long as they keep adding new blog articles, the site will continue to rank in Google's search results.

But after a certain point, they may lose search rankings due to poor technical site structure, a crawling problem, thin content, or a new algorithm update. To stay competitive among the hundreds of billions of web pages in Google's index, one must therefore optimise their crawl budget. Here are a few quick tips:

1. Use Google Search Console to track crawl status

Checking crawl status regularly, for instance every 30 to 60 days, is critical for detecting problems that affect the site's overall marketing performance. It's a foundational step in SEO; much of the rest depends on it. If a web page returns a 404 error or has been temporarily redirected, one can request its removal directly via Search Console.
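Alongside Search Console, raw server access logs show exactly which URLs Googlebot is fetching and with what status codes. The sketch below is a simplified illustration: the log format and user-agent check are assumptions (real log formats vary, and genuine Googlebot traffic should be verified, e.g. by reverse DNS):

```python
import re
from collections import Counter

# Matches a common access-log shape: request line, status code,
# and a Googlebot user-agent string later on the same line
LOG_LINE = re.compile(
    r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*Googlebot'
)

def crawl_summary(log_lines):
    """Count Googlebot hits per (path, status) pair.

    Grouping by status makes wasteful patterns stand out, such as
    404 pages that keep getting re-crawled for no good reason.
    """
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m:
            counts[(m.group("path"), m.group("status"))] += 1
    return counts
```

Sorting the resulting counter by hit count gives a quick, free picture of where the crawl budget is actually going.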

2. Create mobile-friendly web pages

SEOs must optimise their pages for Google's mobile-first index by serving mobile-friendly versions. Here are some good technical tweaks to try:

  • insert the viewport meta tag in the page's head
  • implement responsive web design
  • serve AMP versions of web pages via the AMP cache
  • optimise and compress images to load them faster
  • reduce the size of on-page UI elements
  • minify on-page resources, like JS and CSS
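As a quick way to audit the first item on the list, a page's HTML can be checked for the viewport meta tag programmatically. A small sketch using Python's standard-library HTML parser:

```python
from html.parser import HTMLParser

class ViewportChecker(HTMLParser):
    """Flags whether a page declares a viewport meta tag,
    e.g. <meta name="viewport" content="width=device-width, initial-scale=1">."""

    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "meta" and attributes.get("name") == "viewport":
            self.has_viewport = True

def has_viewport_tag(html):
    """Return True if the given HTML declares a viewport meta tag."""
    checker = ViewportChecker()
    checker.feed(html)
    return checker.has_viewport
```

A check like this can be run over a whole sitemap's worth of URLs to find pages that were never made mobile-friendly.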

3. Update content regularly

If a website produces fresh content regularly, Google will crawl its pages more frequently. This is especially advantageous for publishers who need new articles indexed and refreshed in the results quickly.

A site that is continuously improving and producing fresh articles needs to be crawled more often to reach its target audience.

Position1SEO Helps Improve Your Site Indexation

Position1SEO is a UK-based SEO company that provides fresh articles and updates for your existing content. We help optimise your web pages to boost your chances of being indexed and ranking higher in Google's search results. With our help, you can boost your online visibility and increase traffic to your website!

Position1SEO helps you improve website performance and accessibility, conduct sitemap installations and positive link building, and more. We also provide a thorough SEO audit and free phone consultation. Contact us today for more information on how to take your business’ SEO to the next level!

Author: Jason Ferry
Jason Ferry is an SEO specialist based in the United Kingdom. Taking pride in his many years of experience in the search engine optimisation industry, he has honed his skills in various digital marketing processes, from keyword marketing, website auditing, and link building campaigns to social media monitoring. Jason Ferry's excellent SEO skills, combined with his vast experience, make him one of the best professionals to work with in the industry today. Not only does he guarantee outstanding output for everyone he works with, but he also values a deep relationship with each of them.
