Control how search engines crawl your site. Create a properly formatted robots.txt file in seconds with our visual builder and protect your crawl budget.
Upload this file to the root folder of your website:
yourdomain.com/robots.txt
Understanding the directives used in robots.txt.
User-agent: *
"User-agent" specifies which bot the rule applies to. The asterisk (*) is a wildcard meaning "all robots".
Disallow: /private/
"Disallow" tells robots NOT to visit a page or directory. "Disallow: /" blocks the entire site.
Allow: /private/public-page
"Allow" is used to grant access to a specific file or folder inside a disallowed directory.
Sitemap: https://...
Tells crawlers exactly where to find your XML sitemap, helping them discover all your pages efficiently.
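Putting it all together, a minimal complete file might look like this (the sitemap URL is a placeholder; the Sitemap line sits outside any User-agent group and applies to all crawlers):

User-agent: *
Disallow:

Sitemap: https://yourdomain.com/sitemap.xml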
Avoid syntax errors that can accidentally de-index your site.
Stop AI bots from scraping your content for training.
Stop bad bots from overloading your server.
Guide Googlebot to your most important pages (a combined example follows this list).
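A single file can handle all three of the cases above. A sketch, with assumptions flagged: GPTBot (OpenAI) and CCBot (Common Crawl) are real AI-crawler user-agents, "SlowBot" and the paths are illustrative, and Crawl-delay is honored by Bing and Yandex but ignored by Google.

# Opt out of AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Throttle an aggressive bot (Bing and Yandex honor this; Google does not)
User-agent: SlowBot
Crawl-delay: 10

# Keep Googlebot focused on your real content
User-agent: Googlebot
Disallow: /search/
Disallow: /cart/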
Your robots.txt file must live at the root of your domain: https://yourdomain.com/robots.txt. If you put it in a subdirectory (e.g., /blog/robots.txt), search engines will ignore it.
Generate your file in seconds and ensure search engines crawl your site correctly.
Generate Robots.txt