In the SEO industry, factors that relate to how Google and other search engines crawl websites are often referred to as “crawlability”. These are usually on-site elements that affect how Google interprets a site: simple things like page structure, or more technical factors such as robots.txt files and XML sitemaps.
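Crawl directives of this kind typically live in a robots.txt file at the site root. A minimal sketch for illustration only (the domain, disallowed path, and sitemap URL are hypothetical):

```
User-agent: *
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml
```

Here every crawler is told not to fetch anything under /private/, and the sitemap line points crawlers to an XML list of the URLs the site wants indexed.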
However, external to a website, search engines have their own set of rules in place that dictate things like how often their spiders will visit a site and how many pages they will crawl.
Google has described the limit it places on crawling a website as a combination of two well-known factors: crawl rate and crawl demand.
- Crawl Rate Limit aims to prevent Google from crawling pages too often or too fast, which could cause performance issues for normal visitors
- Crawl Demand refers to Google’s need to crawl popular websites. This is determined by how frequently the site changes and how popular it is
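The crawl rules a site publishes can also be inspected programmatically. A minimal sketch using Python’s standard-library `urllib.robotparser` (the robots.txt content and URLs here are hypothetical; note that Googlebot ignores the Crawl-delay directive, although some other crawlers honour it as a crawl-rate hint):

```python
from urllib import robotparser

# Hypothetical robots.txt content for illustration
rules = """User-agent: *
Disallow: /private/
Crawl-delay: 10
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A disallowed path is reported as not fetchable
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False

# Anything outside the disallowed path can be crawled
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))  # True

# The crawl-delay hint declared for all user agents
print(rp.crawl_delay("Googlebot"))  # 10
```

In production you would normally call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` to fetch the live file instead of parsing an inline string.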
Crawl Budget is the combination of these two factors and is defined by Google as “the number of URLs Googlebot can and wants to crawl”. Crawl Budget should always be a consideration when optimising a website, as should an understanding of how ongoing changes affect the behaviour of search engine bots.