No, it isn’t a ranking factor, so an increased crawl rate will not necessarily lead to better positions in Search results (Google uses hundreds of signals to rank results, and crawl rate isn’t one of them). Crawling is, however, essential to getting indexed on Google, and a fast website signals to Google that you’re running a healthy site on a healthy server, so Googlebot can fetch more content over the same number of connections. As Google says: “Efficient crawling of a website helps with its indexing in Google Search.”
Please note that it isn’t only your website’s pages that matter when Google crawls your site. Every URL Googlebot fetches, including alternate URLs such as AMP pages and hreflang variants, and embedded content such as CSS and JavaScript, consumes part of the crawl capacity Google dedicates to your website. Google recently introduced the concept of “crawl budget”: the number of URLs Googlebot can and wants to crawl, based on the two factors below (a log-analysis sketch follows the list):
- Popularity: URLs that are more popular on the Internet tend to be crawled more often to keep them fresher in Google’s index
- Staleness: Google systems attempt to prevent URLs from becoming stale in the index
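To get a feel for how those fetches add up on your own site, you can count Googlebot requests by resource type in your server logs. Here is a minimal sketch, assuming an access log in the common “combined” format at a hypothetical path `access.log`; the user-agent match and the extension mapping are rough simplifications:

```python
import re
from collections import Counter

LOG_PATH = "access.log"  # hypothetical path; adjust to your server setup

# Very rough mapping from URL extension to resource type.
TYPE_BY_EXT = {
    ".css": "CSS", ".js": "JavaScript",
    ".png": "image", ".jpg": "image", ".gif": "image",
}

# Combined log format: IP - - [date] "METHOD /path HTTP/1.1" status size "ref" "ua"
LINE_RE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" .* "(?P<ua>[^"]*)"$')

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        m = LINE_RE.search(line)
        if not m or "Googlebot" not in m.group("ua"):
            continue
        path = m.group("path").split("?", 1)[0].lower()
        filename = path.rsplit("/", 1)[-1]
        ext = path[path.rfind("."):] if "." in filename else ""
        counts[TYPE_BY_EXT.get(ext, "page/other")] += 1

for kind, n in counts.most_common():
    print(f"{kind}: {n} Googlebot fetches")
```

Running something like this often shows that a surprising share of the crawl goes to CSS, JavaScript, and images rather than to the pages themselves.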
A site with fewer than a few thousand URLs will be crawled efficiently most of the time, and its owner has nothing to worry about. Bigger sites face the problem of prioritizing what to crawl, when, and how many resources the hosting servers can allocate to crawling. Good to know: new pages tend to be crawled the same day they’re published, so again, their owners shouldn’t worry about crawling.
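If you’re unsure which side of that “few thousand URLs” threshold your site falls on, counting the entries in your XML sitemap is a quick estimate. A minimal sketch, assuming a standard sitemap at a hypothetical URL:

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical; use your own
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    root = ET.fromstring(resp.read())

# A sitemap file lists <url> entries; a sitemap index lists <sitemap> entries.
urls = root.findall("sm:url", NS)
children = root.findall("sm:sitemap", NS)
print(f"{len(urls)} URLs listed, {len(children)} child sitemaps")
if len(urls) < 5000 and not children:
    print("Small site: crawling should not be a concern.")
```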
But Googlebot will never exceed the crawl rate limit, which caps the maximum fetching rate for a given site: the number of simultaneous parallel connections it may use to crawl the website, as well as the time it has to wait between fetches. From Google again: “if the site responds really quickly for a while, the limit goes up, meaning more connections can be used to crawl. If the site slows down or responds with server errors, the limit goes down and Googlebot crawls less”. In the latter case, website owners may perceive crawling as problematic.
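Since speed and server errors are the two conditions Google names, it’s worth spot-checking both yourself. A rough sketch that probes a hypothetical URL a few times and reports average response time and error count (what Googlebot actually experiences may of course differ):

```python
import time
import urllib.request

URL = "https://www.example.com/"  # hypothetical; point this at your own site
SAMPLES = 5

timings, errors = [], 0
for _ in range(SAMPLES):
    start = time.monotonic()
    try:
        with urllib.request.urlopen(URL, timeout=10) as resp:
            resp.read()
    except Exception:
        errors += 1  # timeouts and 4xx/5xx responses both land here
    timings.append(time.monotonic() - start)
    time.sleep(1)  # be gentle with your own server

print(f"avg response time: {sum(timings) / len(timings):.2f}s, "
      f"errors: {errors}/{SAMPLES}")
# Consistently slow responses or server errors are exactly the conditions
# Google says will lower the crawl rate limit.
```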
Other factors influencing crawling are:
- Faceted navigation and session identifiers
- On-site duplicate content
- Soft error pages
- Hacked pages
- Infinite spaces and proxies
- Low-quality and spam content
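Some of these are easy to spot-check. A soft error page, for instance, is a page that returns HTTP 200 while actually telling the visitor the content doesn’t exist, so Googlebot keeps wasting budget on it. A minimal sketch that requests a deliberately nonexistent URL (the path is made up) to verify your server answers with a real 404:

```python
import urllib.request
import urllib.error

# Probe a URL that should not exist; the path is a hypothetical example.
PROBE = "https://www.example.com/this-page-should-not-exist-12345"

try:
    with urllib.request.urlopen(PROBE, timeout=10) as resp:
        # Reaching this branch means the server returned 2xx/3xx for a
        # nonexistent page: a classic soft error page.
        print(f"Soft 404 suspected: got HTTP {resp.status} instead of 404")
except urllib.error.HTTPError as err:
    if err.code == 404:
        print("OK: server returns a real 404 for missing pages")
    else:
        print(f"Server answered HTTP {err.code}; check your error handling")
```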