Search Console: New Update On Crawl Stats Report
Google Search Console’s crawl stats report is a feature that can be highly beneficial for SEO. Search Advocate Daniel Waisberg discussed a major update to the report, aimed at helping webmasters and businesses improve their SEO strategies on Google.
The crawl stats report received a major update several months ago, particularly around showing how well Googlebot can crawl a given website. When Googlebot crawls a website effectively, new content can be discovered and indexed quickly, and the search engine notices changes made to published content sooner.
Google starts with a list of URLs gathered from previous crawls and from sitemaps submitted by the website owner. Its web crawlers then visit those URLs, reading the content and following the links found on each page. They revisit known pages to see whether anything in the content has changed, and crawl any new pages they discover.
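The fetch-and-follow loop described above is essentially a breadth-first traversal of a link graph. As a minimal sketch (using a hypothetical in-memory link graph in place of real webpages, so the example stays self-contained):

```python
from collections import deque

# Hypothetical link graph standing in for real webpages:
# each URL maps to the links found on that page.
LINK_GRAPH = {
    "https://example.com/": ["https://example.com/about", "https://example.com/blog"],
    "https://example.com/about": [],
    "https://example.com/blog": ["https://example.com/blog/post-1"],
    "https://example.com/blog/post-1": ["https://example.com/"],
}

def crawl(seed_urls):
    """Breadth-first 'crawl': visit each known URL once, following links."""
    seen = set(seed_urls)
    queue = deque(seed_urls)
    visited_order = []
    while queue:
        url = queue.popleft()
        visited_order.append(url)          # a real crawler would fetch here
        for link in LINK_GRAPH.get(url, []):
            if link not in seen:           # never queue the same URL twice
                seen.add(link)
                queue.append(link)
    return visited_order

print(crawl(["https://example.com/"]))
# -> ['https://example.com/', 'https://example.com/about',
#     'https://example.com/blog', 'https://example.com/blog/post-1']
```

A production crawler adds politeness delays, robots.txt checks, and revisit scheduling on top of this loop, but the discovery mechanism is the same.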
During the crawling process, Googlebot needs to decide which links or content to prioritise while still ensuring that the site can handle Google’s server requests. Webpages that are crawled successfully are processed and passed on to Google’s indexing systems, which prepare the content to be shown in the search results pages.
Since the search engine does not want to overload servers, the frequency of crawls is based on three important things: crawl rate, crawl demand, and crawl budget.
The crawl rate is the maximum number of requests Googlebot will make to a site without overloading its server. The crawl demand is how much Google wants to crawl the content, influenced by factors such as its popularity and how often it changes. Finally, the crawl budget is the number of URLs Google can and wants to crawl, effectively the combination of crawl rate and crawl demand.
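The relationship between these three quantities can be sketched with a toy model. This is an illustration of the concept only, not Google's actual formula; the numbers and function names are hypothetical:

```python
def crawl_budget(crawl_rate_per_day, crawl_demand_urls):
    """Toy model (not Google's actual formula): the effective crawl
    budget is capped both by what the server can handle (crawl rate)
    and by how many URLs Google wants to crawl (crawl demand)."""
    return min(crawl_rate_per_day, crawl_demand_urls)

# A server that tolerates 5,000 fetches/day but a site with only
# 1,200 URLs worth recrawling ends up with a budget of 1,200...
print(crawl_budget(5000, 1200))  # -> 1200

# ...while a slow server caps a high-demand site at its rate limit.
print(crawl_budget(800, 1200))   # -> 800
```

The practical takeaway is that raising one factor alone doesn't help: a fast server with stale, low-demand content still gets crawled rarely, and popular content on a struggling server gets throttled.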
The crawl stats report in Search Console greatly helps with SEO as it allows SEOs to understand and optimise how Googlebot crawls their site. The feature provides detailed reports and statistics about Google’s crawling behaviour, such as how many requests were made to a website and what the responses were.
Waisberg explained that the report is most useful for large websites, and less so for sites with fewer than a thousand pages. An SEO expert can get detailed information about their website’s general availability, the average page response time for a crawl request, and the number of requests Google made to their website within the last 90 days.
Businesses and webmasters can access the crawl stats report on the Settings page of the Search Console. Upon clicking on the crawl stats report, they will find a summary page that includes a crawl request breakdown, a crawling trends chart, and host status details.
First, the crawl request breakdown helps SEOs and webmasters understand what Googlebot found on their sites, with four available breakdowns: crawl response, crawled file type, crawl purpose, and Googlebot type.
Next, the crawling trends chart shows three metrics: the total download size from a website during the crawling process, the total number of crawl requests for URLs on the website (successful or not), and the average page response time for a crawl request to retrieve webpage content.
Lastly, the host status data allows an SEO to check their website’s general availability over the last 90 days. It shows three categories: server connectivity, robots.txt fetch, and DNS resolution.
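Of these, the robots.txt fetch category is worth understanding in detail, because the HTTP status of that one file affects crawling of the whole site. A simplified sketch of how the fetch outcome is typically interpreted (this loosely follows Google's documented handling, heavily condensed; it is not the report's actual logic):

```python
def robots_txt_fetch_status(http_status):
    """Simplified classification of a robots.txt fetch outcome:
    a 2xx response is fetched and parsed; a 4xx is treated as though
    no robots.txt exists, so crawling is allowed; a 5xx (or network
    failure) counts as a fetch error that can make Googlebot slow
    down or pause crawling of the site."""
    if 200 <= http_status < 300:
        return "fetched"
    if 400 <= http_status < 500:
        return "not found: crawling allowed"
    return "fetch error: crawling may pause"

print(robots_txt_fetch_status(200))  # -> fetched
print(robots_txt_fetch_status(404))  # -> not found: crawling allowed
print(robots_txt_fetch_status(503))  # -> fetch error: crawling may pause
```

The counterintuitive part is that a missing robots.txt (404) is harmless, while a server error (5xx) on that same URL can hurt crawling far more, which is exactly the kind of issue the host status section surfaces.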
Work With A Professional SEO Agency
Using Google Search Console can be confusing, especially if it’s your first time. Here at Position1SEO, we can help manage your website using several Google tools that are useful for monitoring and tracking your SEO progress.
Our team is skilled at creating up-to-date and efficient SEO strategies that significantly boost your rankings. We work in three simple stages: first, a free in-depth SEO audit; second, identifying areas for improvement on your site and conducting keyword research; and lastly, user engagement work to help you retain your online following. If you are interested, do not hesitate to call us on 0141 404 7515 or write to us at email@example.com.