Should We Consider Crawl Budget When Managing An Affordable Optimization SEO Budget?
- 20 March, 2020
- Jason Ferry
- SEO
No business, no matter how large or small, has an unlimited amount of money to spend on SEO. Nor is there any guarantee that, even with a limitless budget, your site would become and stay number one in the search engine rankings. Google uses a hundred and one metrics to evaluate a website, maybe more. The trick is to sift through the many options and prioritise them, developing an affordable optimization SEO strategy that maximises your chances of ranking highly in the search engine results.
That means prioritising some tasks above others. But where do you start? One aspect some website owners ask about is crawl budgets. But what are they, and are they important for you or the search engine optimisation agency acting on your behalf? Here, we’ll explain what crawl budgets are and who needs to take note of them.

First, in straightforward terms, the crawl budget is defined by Google as the number of URLs (or number of pages) its bot can and wants to crawl (i.e. acquire information on). Google sets a ‘crawl rate limit’ on this to make sure it’s not overloading your server with requests. But it also has its own priorities, known as ‘crawl demand’. So the crawl budget for a site is calculated by taking the crawl rate and crawl demand into account: how many URLs can Googlebot crawl in the time allotted to it?

Now, in effect, exceeding the crawl budget probably won’t be an issue for most websites, because they have limited numbers of pages and Googlebot is pretty efficient at its job. But where it may affect you, and therefore your SEO optimization strategy, is if you run an especially large site – like a major e-commerce site with over 10,000 pages, say.
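To make the idea concrete, here is a toy sketch of that interplay. Google doesn’t publish an exact formula, so this is purely illustrative: the bot can only crawl the lesser of what the rate limit allows and what it actually wants to fetch.

```python
def crawl_budget(rate_limit_per_day: int, crawl_demand: int) -> int:
    """Toy model: Googlebot crawls the lesser of what it may (rate limit)
    and what it wants (demand). Illustrative only, not Google's real logic."""
    return min(rate_limit_per_day, crawl_demand)

# A small site: 500 URLs in demand, server comfortably allows 5,000 fetches a day.
print(crawl_budget(5000, 500))    # 500 – the budget is no constraint at all

# A large e-commerce site: 50,000 URLs in demand, but rate-limited to 8,000 a day.
print(crawl_budget(8000, 50000))  # 8000 – some URLs must wait for later crawls
```

You can see why only very large sites need to worry: for the small site, demand is exhausted long before the rate limit bites.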
You or your SEO optimization service may also want to pay particular attention to crawl budget if you’re planning on adding a host of pages to your website and want them indexed and appearing in search results relatively quickly. And finally, what if the Googlebot crawler is slowed down when indexing your site because there are a host of redirects to other pages, or error messages because pages no longer exist? You’ll need to address this.

So if any of these scenarios apply to you, how should you respond? One key step, applicable even to those on the tightest budgets looking for affordable optimization SEO practices, is to work on improving your site’s page speed. Everyone benefits here: website users get a more responsive service while, for the largest sites, Googlebot can crawl more pages in a shorter time, making the best use of your crawl budget.

Next, if you’re adding valuable pages and want them indexed as soon as possible, you can use the Request Indexing function in Google’s Search Console. Simply enter the URL you want indexed in the ‘Inspect any URL in domain.com’ field. This search will tell you whether Google is already aware of a page; if not, you can use the REQUEST INDEXING button in the search results to speed up the discovery process.
Lastly, fix any redirects and error pages. You or your SEO agency can run a site audit to identify issues and rectify them. It’s worth doing this even if your affordable optimization SEO budget doesn’t stretch to crawl-budget work: you still need to make your users’ experience the best it can be, and broken links and error messages can be a real turn-off to prospective customers looking to buy from you.
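The sorting step of such an audit can be sketched in a few lines. This is a minimal, hypothetical example: it assumes you already have a list of (URL, HTTP status) pairs from your server logs or a crawler of your choice, and simply groups them into the buckets an auditor would flag.

```python
from collections import defaultdict

def classify_crawl_results(results):
    """Group (url, http_status) pairs into buckets a site audit would flag.

    `results` is any iterable of (url, status_code) tuples, e.g. gathered
    from server logs or a crawling tool.
    """
    buckets = defaultdict(list)
    for url, status in results:
        if 300 <= status < 400:
            buckets["redirect"].append(url)      # redirect chains waste crawl budget
        elif status in (404, 410):
            buckets["missing"].append(url)       # dead pages: fix or remove the links
        elif status >= 500:
            buckets["server_error"].append(url)  # errors that slow Googlebot down
        else:
            buckets["ok"].append(url)
    return dict(buckets)

# Hypothetical sample data for illustration:
sample = [
    ("/home", 200),
    ("/old-product", 301),
    ("/deleted-page", 404),
    ("/checkout", 500),
]
print(classify_crawl_results(sample))
```

Everything except the "ok" bucket is a candidate for fixing: update internal links pointing at redirects, restore or properly remove missing pages, and investigate server errors.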