How to Optimize your Site Audit Crawl Speed

When you audit your site with Site Audit, we send a crawler to inspect the pages you specify in the tool’s configuration. 

However, it is possible to run into crawling-related issues such as:

  • The crawler crawls the website too fast and temporarily crashes the site or slows it down for users.
  • The crawler crawls the website too fast and, as a result, pages don’t load properly. Semrush then reports “false positives”: issues in your audit results that would not have been reported if the audit had crawled at a slower rate and the site had not crashed. 
  • The crawler takes too long to crawl the site, so you wait too long for your results.

Speed up Site Audit

If your Site Audit is crawling too slowly, you can speed it up in the crawler settings step of the configuration window. 

To speed up your Site Audit, choose “minimum delay between pages” to crawl the site as fast as possible. 

[Image: Site Audit crawler settings]

This setting tells our bot to ignore any crawl-delay instructions in the site’s robots.txt and crawl at the fastest possible speed.

If you are still not satisfied with the speed of your crawl, the other step you can take is to make your pages load faster: slow-loading pages slow down the crawl. You can time a few page loads yourself to see whether this is the bottleneck, as in the sketch below. 
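
Here is a minimal sketch of that check in Python; the example.com URLs are placeholders for pages on your own site and are not part of Site Audit:

    import time
    import urllib.error
    import urllib.request

    # Placeholder URLs; replace them with pages from your own site.
    PAGES = [
        "https://www.example.com/",
        "https://www.example.com/blog/",
    ]

    for url in PAGES:
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=30) as response:
                response.read()  # download the full body, as a crawler would
        except urllib.error.URLError as exc:
            print(f"{url}: failed to load ({exc})")
            continue
        elapsed = time.monotonic() - start
        print(f"{url} loaded in {elapsed:.2f}s")

Pages that consistently take several seconds to load here will slow any crawler down, so improving their load time also speeds up the audit.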

Now, keep in mind that a bot crawling your site at a very high speed can cause slower loading times for users on the website. 

So, in certain cases it is wiser to slow down your crawl rather than speed it up. 

Slow down Site Audit

If your Site Audit is crawling too fast, it can temporarily crash a site or slow it down for users and produce false-positive results. Slow down the crawl speed to ensure that your results represent a real user’s experience on the site. 

To slow down your audit, you have a few options in your configuration settings:

  • Tell Semrush to crawl 1 URL per 2 seconds. This ensures that the crawl doesn’t overwhelm your site’s server with a burst of activity in a short amount of time. 
  • Set a custom crawl delay in your website’s robots.txt file. If you choose the “respect robots.txt” option in the configuration steps, our bot will crawl your pages only at the speed you instruct it to. 

    For example, adding “Crawl-delay: 5” to your robots.txt file tells the bot to wait 5 seconds before requesting the next page. Again, this keeps the crawl at a pace that will not crash or slow down the website (see the example below; a sketch for verifying the declared delay follows it). 
    # Ask both Semrush user agents to wait 5 seconds between page requests
    User-agent: SemrushBot
    Crawl-delay: 5
    
    User-agent: SiteAuditBot
    Crawl-delay: 5
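
If you want to double-check what a crawler will read from your file, you can parse the live robots.txt with Python’s standard-library robotparser. This is a minimal sketch, assuming https://www.example.com/robots.txt stands in for your own site’s file:

    import urllib.robotparser

    # Fetch and parse robots.txt the same way a well-behaved crawler does.
    parser = urllib.robotparser.RobotFileParser()
    parser.set_url("https://www.example.com/robots.txt")
    parser.read()

    # crawl_delay() returns the Crawl-delay value declared for a user agent,
    # or None if the file declares no delay for it.
    for agent in ("SemrushBot", "SiteAuditBot"):
        print(agent, "->", parser.crawl_delay(agent))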

Please note: the maximum crawl delay we can apply is 30 seconds, so any value above 30 is treated as 30 by our bot.
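
Expressed as code, the rule from this note is a simple clamp. The 30-second cap comes from the note above; the function name is only illustrative:

    # The delay the bot actually honors, given the 30-second cap.
    def effective_crawl_delay(declared_seconds: float) -> float:
        return min(declared_seconds, 30)

    print(effective_crawl_delay(5))   # 5 -> honored as declared
    print(effective_crawl_delay(60))  # 60 -> treated as 30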

If you’re still facing issues running your Site Audit, try our troubleshooting guide or contact our support team.
