How Sitemap Files Are Processed By Googlebot
- 17 December, 2019
- Jason Ferry
- Googlebot
In a recent Reddit thread, a user asked whether having multiple sitemap files for client websites was redundant. John Mueller of Google answered that the number of files does not matter for search engine optimisation, and went on to explain how Google processes this information. According to him, Googlebot receives all of a website’s sitemap files as a single, large set of data. In effect, Google never really knows how many sitemap files a website has because it reads everything as one big list of URLs.
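To picture what that means in practice, here is a minimal Python sketch, not Google’s actual pipeline and with hypothetical file names, that merges several sitemap files into one flat set of URLs, which is effectively how Mueller describes Googlebot consuming them:

```python
import xml.etree.ElementTree as ET

# Sitemap files use this XML namespace, so element lookups must include it.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap(path):
    """Yield every <loc> value from a single sitemap file."""
    tree = ET.parse(path)
    for url in tree.getroot().iter(f"{SITEMAP_NS}url"):
        loc = url.find(f"{SITEMAP_NS}loc")
        if loc is not None and loc.text:
            yield loc.text.strip()

# Hypothetical file names; any number of sitemaps collapses into one set.
sitemap_files = ["sitemap-posts.xml", "sitemap-pages.xml", "sitemap-products.xml"]
all_urls = set()
for path in sitemap_files:
    all_urls.update(urls_from_sitemap(path))

print(f"{len(all_urls)} unique URLs across {len(sitemap_files)} sitemap files")
```

However the URLs are split across files, the result is the same single pile of URLs, which is why the file count carries no SEO weight on its own.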
Mueller added that the last-modification dates in a sitemap file are what SEO experts and website owners should focus on instead, as they can be valuable signals to Google when used properly. He noted that using the sitemap’s generation time as the last-modification date is not beneficial, because that is not when a page’s main content was actually altered. Website owners and SEOs should also watch out for servers that return a dynamic last-modification date for every URL, as these are equally unhelpful.
Mueller advised website owners to give Googlebot accurate last-modification dates, so that each URL in the sitemap files carries the date its content was genuinely changed rather than one identical date across the board.
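As an illustration of that advice, here is a minimal Python sketch, with hypothetical URLs and dates, that writes a sitemap where each &lt;lastmod&gt; reflects the date a page’s content actually changed rather than the time the file was generated:

```python
from datetime import datetime, timezone
import xml.etree.ElementTree as ET

# Hypothetical mapping of URL -> the datetime the page content last changed.
# In a real site this would come from a CMS or file modification records.
pages = {
    "https://www.example.com/": datetime(2019, 12, 10, tzinfo=timezone.utc),
    "https://www.example.com/blog/": datetime(2019, 11, 2, tzinfo=timezone.utc),
}

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc_text, changed in pages.items():
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc_text
    # The date the content changed, NOT datetime.now() at generation time.
    ET.SubElement(url, "lastmod").text = changed.date().isoformat()

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

The key design point is that each URL keeps its own date, which is what makes the signal useful to Google in the first place.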
This blog post is based on information from https://www.searchenginejournal.com/john-mueller-googlebot-receives-sitemaps-in-the-form-of-an-energy-drink/340643/. Visit the link for complete details.
Working with an excellent UK-based provider of SEO services will significantly benefit your business’s online strategy. Visit our homepage at Position1SEO today to learn more about the packages we offer.