What is robots.txt?

The robots.txt file tells search engine crawlers which URLs they can access on your site. Common uses include keeping crawlers out of private or low-value sections, managing crawl traffic on large sites, and pointing bots to your sitemaps.
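The file lives at the root of your domain, e.g. https://example.com/robots.txt. A minimal sketch (the blocked path and domain are placeholders):

    # Applies to every crawler
    User-agent: *
    # Keep bots out of the admin area
    Disallow: /admin/
    # Advertise the sitemap location
    Sitemap: https://example.com/sitemap.xml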

Configuration

Each user-agent group (the form labels them User Agent #1, #2, and so on) takes the following fields; a complete example follows this list.

- User-agent: use * for all bots, or specify one: Googlebot, Bingbot, etc.
- Disallow: paths that bots should NOT crawl
- Allow: paths that bots CAN crawl (overrides Disallow)
- Crawl-delay: delay between requests in seconds (optional)
- Sitemap: full URLs to your sitemap files
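Putting the fields together, a generated file might look like this sketch; all paths and the sitemap URL are placeholders:

    # Group 1: rules for Bing's crawler
    User-agent: Bingbot
    Disallow: /private/
    # Exception inside the blocked path
    Allow: /private/press-kit.html
    # At most one request every 10 seconds
    Crawl-delay: 10

    # Group 2: rules for all other bots
    User-agent: *
    Disallow: /tmp/

    # Sitemap directives apply to the whole file, not to one group
    Sitemap: https://www.example.com/sitemap.xml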