Mueller Discusses the Two Types of Crawling
When it comes to crawling and indexing in SEO, Google uses two types of crawling: discovering new content and refreshing content it has already crawled. Google's Search Advocate John Mueller discussed the topic in a Google Search Central SEO office-hours hangout on 7 January.
One SEO expert joined the live stream to ask Mueller several questions, one of which was about how frequently Googlebot crawls their website. They shared that when they published more, Googlebot crawled their site daily. Now that they publish fewer articles, Google doesn’t crawl the website as often.
The person asked Mueller if this was normal, probably because they thought a reduction in crawl frequency was a bad sign. However, Mueller assured them that it was fine and explained Googlebot’s two types of crawling.
Crawling and Indexing in SEO Explained
Before discussing the two types of crawling, one must first understand how Google crawls and indexes. Google Search describes the Internet as an ever-growing library with billions of books and no central filing system. Google employs web crawlers to discover publicly available web pages, examine websites, and follow links on them – just as users would while browsing the web. These crawlers move from link to link, gathering information about those sites and returning it to Google's servers.
When crawlers come across a webpage, Google's systems render the page's content, just as a browser would. They note important signals – such as keywords and website freshness – and record them in the Search index.
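The link-following behaviour described above can be sketched in a few lines. This is a toy illustration, not Google's implementation: the `crawl` function below does a breadth-first walk over an in-memory "site" (a hypothetical dict mapping URLs to HTML) instead of fetching real pages, and `LinkExtractor` collects outlinks the way a crawler would.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags, the way a crawler gathers outlinks."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(site, start):
    """Breadth-first discovery over an in-memory site (url -> HTML string)."""
    seen, queue, order = {start}, [start], []
    while queue:
        url = queue.pop(0)
        order.append(url)
        parser = LinkExtractor()
        parser.feed(site.get(url, ""))
        for link in parser.links:
            if link in site and link not in seen:  # follow each link once
                seen.add(link)
                queue.append(link)
    return order

# Three toy pages: the home page links out to two articles.
site = {
    "/": '<a href="/a">Article A</a> <a href="/b">Article B</a>',
    "/a": '<a href="/">home</a>',
    "/b": "",
}
print(crawl(site, "/"))  # pages in the order they are discovered
```

A real crawler adds politeness delays, robots.txt checks, and deduplication, but the core loop – fetch, extract links, queue unseen pages – is the same.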
The Google Search index contains hundreds of billions of web pages and is well over 100 million gigabytes in size. It works much like the index at the back of a book, with an entry for every word seen on every page analysed. When Google indexes a web page, it adds the page to the entries for all the words it contains.
With the Knowledge Graph, Google goes beyond keyword matching to understand the people, places, and things that users care about. It also organises information about web pages alongside other sorts of data. Google Search can now help users search text from millions of books from major libraries, find travel times from their local public transportation agency, or navigate data from open sources like the World Bank.
Googlebot’s Two Types of Crawling
One can see how often Googlebot crawls their website in Search Console's Crawl Stats report, which also shows the days when Google crawls the site more frequently than usual.
When asked about the report’s findings, Mueller confirmed that fluctuations are common. He said that it’s not about crawling a website but about crawling individual pages. He also explained the two types of crawling techniques they use.
The first crawling technique is a “discovery crawl”, where Google looks for new web pages on a site. The second one is a “refresh crawl” to update existing content that they have already crawled before.
Moreover, the crawl frequency of a whole website and individual web pages can vary. If a homepage is updated more frequently than other pages, the site owner will see more Googlebot activity on that specific page. Mueller said that Google would probably use “refresh crawl” on the homepage every few hours or once a day.
If Google finds new links on the home page, it will crawl those pages with a "discovery crawl". As a result, site owners and SEO experts usually see a mix of refresh and discovery crawling, with at least a little crawling happening every day.
On the other hand, if Google finds that a website's pages do not change very often, its systems may decide they do not need to crawl them as frequently.
Some websites are more likely to be visited than others. A news website, which is always updated, will be crawled more frequently than a site that publishes only one article every month. Googlebot can recognise these patterns and change its crawling speed to match them.
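A toy scheduler can illustrate the two crawl types and the adaptive frequency described above. This is an assumption-laden sketch, not Google's actual algorithm: the `Scheduler` class, the halving/doubling intervals, and the hour-based clock are all invented for illustration. Unseen URLs get a "discovery" crawl; known URLs get "refresh" crawls, spaced further apart when the page rarely changes.

```python
from dataclasses import dataclass, field

@dataclass
class PageState:
    interval: float = 24.0    # hours to wait between refresh crawls
    last_crawl: float = -1e9  # hour of the last crawl (effectively "never")

@dataclass
class Scheduler:
    """Toy crawl scheduler: unseen URLs get a discovery crawl; known URLs
    get refresh crawls, spaced further apart when the page rarely changes."""
    pages: dict = field(default_factory=dict)

    def crawl(self, url, now, changed=False):
        state = self.pages.get(url)
        if state is None:  # never seen before: discovery crawl
            self.pages[url] = PageState(last_crawl=now)
            return "discovery"
        if now - state.last_crawl < state.interval:
            return "skip"  # too soon to refresh this page again
        state.last_crawl = now
        # Adapt: crawl changing pages more often, static pages less often.
        if changed:
            state.interval = max(1.0, state.interval / 2)
        else:
            state.interval = min(24 * 30, state.interval * 2)
        return "refresh"

s = Scheduler()
print(s.crawl("/", now=0))                 # "discovery"
print(s.crawl("/", now=30, changed=True))  # "refresh"; interval drops to 12h
print(s.crawl("/", now=36))                # "skip": only 6h since last crawl
```

A frequently updated news site keeps shrinking its interval under this scheme, while a site that publishes once a month drifts toward the monthly cap – matching the pattern Mueller describes.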
Mueller also noted that crawling a news website more frequently than non-news outlets does not indicate better ranking or quality. Googlebot simply recognises which websites need to be crawled more often.
Therefore, SEOs should not be alarmed if Googlebot crawls their websites more or less often. Nor do they need to worry if Googlebot has recently crawled their website but updates to existing content do not yet show in the search results. That may be because Google crawled the site to discover new content rather than to refresh existing pages.
If site owners and publishers rarely change their published content, Googlebot might only perform "discovery crawls" on their web pages – and this crawling pattern is not necessarily a reflection of content quality.
Asking Google to Re-Crawl Web Pages
Search Central has provided the SEO community with several methods to ask Google to re-index web pages after pages have been added or modified. However, SEOs cannot make such a request for URLs that they don't manage.
Below are some of the general guidelines that one should be aware of when asking Google to re-index their web pages:
- It might take anywhere from a couple of days to a few weeks for Google to crawl web pages. One should be patient and monitor progress using the URL Inspection tool or the Index Coverage report in Search Console.
- All methods that Google Search Central provides have the same response time.
- Submitting individual URLs has a quota.
- Requesting Google to re-crawl the same web page or sitemap multiple times does not speed up the process.
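Besides requesting re-indexing directly, site owners can keep an up-to-date sitemap so crawlers can see which pages have changed since their last visit. As a minimal sketch using only the Python standard library – the URLs below are hypothetical placeholders – this builds a sitemap with a `lastmod` date for each page:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return sitemap XML listing each URL with its last-modified date."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"  # sitemap protocol namespace
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Placeholder pages; example.com stands in for a real domain.
xml = build_sitemap([
    ("https://example.com/", "2022-01-07"),
    ("https://example.com/blog/new-post", "2022-01-06"),
])
print(xml)
```

Regenerating the sitemap whenever content changes, and listing it in Search Console, gives Google an accurate picture of what is new – without repeatedly requesting re-crawls, which, as noted above, does not speed things up.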
Prepare Your Web Pages for Search Engine Indexing
Position1SEO is a reputable SEO agency based in the UK, providing a wide range of services for all business niches. Our team of experts will keep your web pages fully optimised in order to help Google crawl and index your content.
We offer free in-depth SEO audits to highlight areas of improvement for your website, as well as a phone consultation to provide you with detailed investment proposals, letting you decide which tasks you want us to handle. You can target any keywords, at any competition level, during this stage.
We use white hat SEO tactics, and you can rest assured that no stone is left unturned as we get your site ranking at the top of the search engine results pages. We also offer transparency with every project we work on – so you know exactly what we’re doing for you and why it’s essential.
Don’t hesitate to contact us for more information about our SEO services!