Adding a crawl delay to robots.txt

Various search engines and bots will scan your website pages to index them.

This can consume server resources quickly if you have lots of pages.

We recommend adding a Crawl-delay directive to your robots.txt file to help throttle those bots.

User-agent: SeznamBot
User-agent: DotBot
User-agent: bingbot
User-agent: YandexBot
Crawl-Delay: 30
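A well-behaved crawler reads this value from robots.txt and waits that many seconds between requests. As a quick sanity check of the rules above, Python's standard `urllib.robotparser` module can parse the group and report the delay it would apply (a minimal sketch; the bot names are the ones from the snippet above):

```python
import urllib.robotparser

# The robots.txt rules from above, as a string for illustration.
rules = """\
User-agent: SeznamBot
User-agent: DotBot
User-agent: bingbot
User-agent: YandexBot
Crawl-Delay: 30
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Bots listed in the group get the 30-second delay.
print(rp.crawl_delay("bingbot"))    # 30

# Bots not covered by any group get no delay rule.
print(rp.crawl_delay("Googlebot"))  # None
```

Note that all four User-agent lines share the single Crawl-Delay line because they form one group; a blank line between them would start a new group.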


See the related article on how to create a robots.txt file.

About the author

Level 1 Technical Support