Gary Illyes from Google posted a new Q&A last night on the Google Webmaster Central Blog. It clarifies that if you disallow a URL in your robots.txt file, it will not affect your crawl budget.
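For context, a disallow rule in robots.txt looks like the minimal sketch below; the User-agent and Disallow directives are standard robots.txt syntax, while the /private/ path is a hypothetical example. Per the Q&A, URLs blocked this way do not count against crawl budget.

  # Hypothetical robots.txt served at https://example.com/robots.txt
  User-agent: Googlebot
  # Per the Q&A, URLs blocked here do not affect crawl budget
  Disallow: /private/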


The blog post, published on 16th January 2017, addresses queries related to crawl budget. In it, Gary Illyes explains what crawl budget is, how the crawl rate limit works, what crawl demand is, and which factors impact a site's crawl budget.

Earlier in the post, Gary had noted that while for most sites 'crawl budget is not something to worry about', for really large sites it can be something worth looking at.

Gary Illyes had also written in the post:

  • Crawl rate limit keeps Google from crawling a site's pages too fast or too often, which could hurt its server.
  • Crawl demand is how much Google wants to crawl your pages. It depends on two factors: the popularity of your site and how stale your content is.
  • Factors that affect crawl budget include: faceted navigation (illustrated after this list), on-site duplicate content, soft error pages, hacked pages, and low-quality and spam content.
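
To illustrate the faceted-navigation point, consider the hypothetical URLs below: each filter combination and parameter ordering yields a distinct crawlable URL for essentially the same page, and crawling those variants eats into the budget.

  https://example.com/shoes?color=red&size=9
  https://example.com/shoes?size=9&color=red
  https://example.com/shoes?color=red&size=9&sort=price

Same product listing, three separate URLs for Googlebot to crawl.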