John Mueller Discusses Discovery And Refresh Metrics
Businesses and webmasters should always conduct an accurate SEO site analysis before doing anything else on their website. Google’s John Mueller discussed Search Console’s recently updated Crawl Stats report in more detail, focusing on the discovery and refresh metrics.
A few weeks ago, the company updated Google Search Console’s Crawl Stats report, and it now provides users with new data that was not included in older reports.
The SEO site analysis provided by the new Crawl Stats report shows URLs categorised by response codes, allowing users to see which pages are being crawled by the search engine. It can also flag 404 URLs that Google is crawling but that were not shown in the coverage report. These new features allow users to carry out granular technical audits without needing to access their server log files.
On 27 November, a specific section of data called Crawl Purpose was brought up in an edition of the Google Search Central live stream.
Mueller was asked to provide more detailed information on the discovery and refresh metrics in Crawl Purpose. Specifically, he was asked about the difference between the percentage of “discovered” and “refreshed” links.
The person who asked the question said that they were currently seeing 84 per cent refreshed URLs. They wondered whether this meant Google spends 84 per cent of its crawling on URLs already known from its database, and only 16 per cent crawling their sitemaps and website for new links.
The official Google Search Console help document offers only brief descriptions of the discovery and refresh metrics. According to the document, “discovery” means that the requested URL was never crawled by Google in the past, while “refresh” refers to a recrawl of a known webpage.
Mueller expanded on the help document’s information to answer the person’s question. He admitted that he was not sure exactly which links would be categorised under the discovery and refresh metrics, but he shared his own understanding with the SEO community.
According to Mueller, refreshed URLs are previously crawled pages, which Google crawls again to update the page’s information in the search index. Discovered URLs, on the other hand, are webpages that Google has crawled for the first time. These could come from new external or internal links pointing to a website.
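Mueller’s distinction can be illustrated with a small sketch: every crawl request is either a refresh (the URL is already known) or a discovery (the URL is seen for the first time). The function name and sample URLs below are hypothetical, purely for illustration of how the two percentages relate:

```python
def classify_crawls(known_urls, crawled_urls):
    """Split crawl requests into 'refresh' (previously known URLs)
    and 'discovery' (URLs crawled for the first time)."""
    known = set(known_urls)
    refresh_count = 0
    discovery_count = 0
    for url in crawled_urls:
        if url in known:
            refresh_count += 1
        else:
            discovery_count += 1
            known.add(url)  # a discovered URL becomes known for later crawls
    total = refresh_count + discovery_count
    return {
        "refresh_pct": 100 * refresh_count / total,
        "discovery_pct": 100 * discovery_count / total,
    }

# Hypothetical example: four known pages recrawled, one brand-new page found
stats = classify_crawls(
    known_urls=["/", "/about", "/blog", "/contact"],
    crawled_urls=["/", "/about", "/blog", "/contact", "/new-post"],
)
print(stats)  # {'refresh_pct': 80.0, 'discovery_pct': 20.0}
```

In this toy example, recrawling mostly known pages yields a high refresh percentage, mirroring the 84 per cent figure the questioner reported.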
He also said that when Google recrawls refreshed URLs, it picks up content updates while actively looking for new links, which is how new content is discovered.
Moreover, the percentage of refreshed URLs will usually be higher than that of discovered URLs. The exceptions are when new sitemaps are uploaded, when a website is newly launched, or when a site has migrated to another domain.
If the SEO analysis report from the Crawl Stats tool shows that Google is not crawling rapidly changing pages often enough, businesses and website owners should ensure those pages are included in a sitemap. Webpages that aren’t updated frequently will be crawled less often, although website owners can manually ping Google to recrawl their sitemaps if desired.
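As a sketch of the manual ping mentioned above: at the time, Google supported a simple HTTP GET endpoint that accepts a sitemap URL (this ping endpoint has since been deprecated by Google, so treat the snippet as illustrative only):

```python
from urllib.parse import urlencode

# Historical Google sitemap ping endpoint (now deprecated)
GOOGLE_PING = "https://www.google.com/ping"

def build_ping_url(sitemap_url):
    """Build the GET request URL that asks Google to refetch a sitemap."""
    return GOOGLE_PING + "?" + urlencode({"sitemap": sitemap_url})

# To actually send the ping, you would issue a GET request, e.g.:
#   import urllib.request
#   urllib.request.urlopen(build_ping_url("https://example.com/sitemap.xml"))
print(build_ping_url("https://example.com/sitemap.xml"))
```

The sitemap URL is percent-encoded so it survives being passed as a query parameter.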
Work With The SEO Experts
Here at Position1SEO, we provide you with relevant content that really answers the queries of your potential customers. This will make your website more authoritative, which develops trust among your potential users. It also increases your chances of getting more buying customers, as well as potential business partners that will help your company grow.
Our SEO experts ensure that your website receives organic traffic, meaning you get desirable results in the long run. We also make sure that you will not get Google penalties, which can be very harmful to your website’s SEO.
You can get a free in-depth website audit to make sure that your SEO is done right from the start. For queries about our services, dial 0141 404 7515 or email email@example.com today.