What is Crawl Budget?
Crawl budget is how much a search engine bot (typically Googlebot) can and will crawl a site and its associated URLs to keep the search engine's index up to date. How a crawl budget is determined can be complicated, but it comes down primarily to two factors: crawl demand and crawl rate limit.
More About Crawl Budget
To better understand crawl budget, it’s helpful to know more about crawl demand and crawl rate limit.
Crawl demand is pretty straightforward. The more popular a site is, the higher the need (and demand) for it to be crawled. Googlebot, in particular, will want to ensure it has up-to-date records on a frequently visited site.
For less popular URLs, crawl demand focuses on maintaining freshness. Google doesn’t want stale and outdated content clogging up top rankings in the SERPs.
Crawl rate limit is the maximum number of parallel connections that Googlebot can use to crawl a site, plus the waiting time required between fetches. Site owners can set limits on this, but server capacity also impacts this number.
A fast-responding web server will earn a higher crawl rate limit. Conversely, Googlebot will reduce its crawling rate to avoid interfering with a slower site's functionality.
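One practical way to see how much of your crawl budget is actually being used is to count Googlebot requests in your server's access log. The sketch below is a minimal illustration, assuming the common Apache/Nginx combined log format; the sample lines and paths are made up, and note that a user-agent string alone can be spoofed (Google recommends reverse-DNS verification for anything security-sensitive).

```python
# Minimal sketch: count requests whose user agent claims to be Googlebot,
# grouped by requested path. Log format is an assumption (Apache/Nginx
# "combined" format); adapt the regex to your server's configuration.
from collections import Counter
import re

# Hypothetical sample access-log lines for illustration.
LOG_LINES = [
    '66.249.66.1 - - [10/May/2024:06:12:01 +0000] "GET /page-a HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [10/May/2024:06:12:02 +0000] "GET /page-b HTTP/1.1" 200 1024 "-" '
    '"Mozilla/5.0 (Windows NT 10.0)"',
    '66.249.66.1 - - [10/May/2024:06:13:44 +0000] "GET /page-c HTTP/1.1" 404 0 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

def count_googlebot_hits(lines):
    """Return per-path counts of requests whose user agent mentions Googlebot."""
    hits = Counter()
    for line in lines:
        if "Googlebot" in line:
            match = re.search(r'"(?:GET|POST|HEAD) (\S+)', line)
            if match:
                hits[match.group(1)] += 1
    return hits

print(count_googlebot_hits(LOG_LINES))  # Counter({'/page-a': 1, '/page-c': 1})
```

Watching these counts over time shows whether Googlebot is revisiting your important pages often (healthy crawl demand) or burning budget on 404s and low-value URLs.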