Robots.txt File Generator
Optional: For documentation purposes
Your website's base URL, without a trailing slash (e.g., https://example.com)
Full URL to your XML sitemap
Seconds between requests (0-10, 0 = no delay)
Hold Ctrl/Cmd to select multiple search engine bots
Optional comments that will appear at the top of the file
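Together, these settings map directly onto lines in the generated file. A minimal sketch, assuming a placeholder domain, a five-second delay, and an illustrative header comment (Bing and Yandex honor Crawl-delay, while Googlebot ignores it):

```
# Sample header comment for documentation
User-agent: *
Crawl-delay: 5
Disallow:

Sitemap: https://example.com/sitemap.xml
```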
Default Crawler Rules
Allow All
Allow all search engines to crawl your entire website
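In robots.txt syntax, this preset reduces to a single group with an empty Disallow value, which permits everything:

```
User-agent: *
Disallow:
```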
Block Private Areas
Allow crawling but block private/admin areas
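A sketch of the output for this preset, assuming /admin/ and /private/ stand in for your protected directories:

```
User-agent: *
Disallow: /admin/
Disallow: /private/
```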
Block All
Block all search engines from crawling your site
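This preset corresponds to disallowing the root path for every user agent:

```
User-agent: *
Disallow: /
```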
Custom Rules
Create your own custom crawling rules
Path Rules
| Path | Rule | User-Agent | Actions |
| --- | --- | --- | --- |
Add specific paths to allow or block. Use `*` as a wildcard and `$` to match the end of a URL.
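For example, the sketch below blocks every URL ending in .pdf and a hypothetical /tmp/ directory, then re-allows one subtree; the Allow directive and wildcard matching are extensions honored by major crawlers such as Googlebot and Bingbot, but not guaranteed for every bot:

```
User-agent: *
# Block any URL that ends in .pdf
Disallow: /*.pdf$
# Block everything under /tmp/
Disallow: /tmp/
# Re-allow one subtree inside the blocked area
Allow: /tmp/public/
```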
Validation Results
| Check | Status |
| --- | --- |
| Syntax | Valid |
| URLs | Valid |
| Rules | Complete |
| SEO | Check sitemap |
Robots.txt Preview
Search Engine Control
Control how search engine crawlers access your website content (note: robots.txt governs crawling, not indexing)
Security & Privacy
Keep crawlers out of private areas, admin panels, and other sensitive paths (robots.txt is advisory, not access control; use authentication for truly sensitive data)
Crawl Optimization
Optimize crawl budget and server resources with crawl delays (note: Googlebot ignores the Crawl-delay directive)
Validation & Testing
Validate your robots.txt syntax and test it with Google's testing tool
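Beyond Google's tool, a quick programmatic check is possible with Python's standard-library urllib.robotparser. The URLs below are placeholders, and this parser follows the classic robots.txt rules, so it may not evaluate Google-style wildcards exactly as Googlebot does:

```python
from urllib.robotparser import RobotFileParser

# Placeholder URL: point this at your own deployed robots.txt.
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the live file

# Ask whether a given crawler may fetch a specific URL.
print(rp.can_fetch("Googlebot", "https://example.com/admin/page"))
print(rp.can_fetch("*", "https://example.com/blog/post"))
```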