Robots.txt Generator
Generate robots.txt files for your website. Control search engine crawling.
Generated robots.txt
User-agent: *
Allow: /
About Robots.txt Generator
Robots.txt Generator creates a properly formatted robots.txt file for your website. Configure which search engine crawlers can access which paths: block specific bots, disallow private directories, and set your sitemap URL. Then download the finished file, ready to upload to your site root so it is served at /robots.txt.
The robots.txt file implements the Robots Exclusion Protocol, a web standard that tells search engine crawlers which paths they may and may not request. It's useful for managing crawl budget on large sites and for keeping admin, login, and other private areas from being crawled. Note that blocking a path from crawling does not guarantee it stays out of search results; use a noindex directive or authentication for that.
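For example, a minimal robots.txt that keeps all crawlers out of an admin directory and declares a sitemap might look like this (the path and sitemap URL are placeholders):
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml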
The generator includes presets for common use cases: allow all, block all, block specific directories, and block specific bots.
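As a rough sketch of what the two simplest presets produce, "allow all" leaves every path open to every crawler:
User-agent: *
Allow: /
while "block all" disallows the entire site for every crawler:
User-agent: *
Disallow: /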
Features
- Configure rules per user-agent or for all bots (see the example after this list)
- Allow and Disallow path rules
- Sitemap URL declaration
- Common presets for typical use cases
- Download the completed robots.txt
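For instance, rules can be grouped by user-agent, so that most crawlers are only barred from a private directory while one named bot is blocked entirely (ExampleBot below is a placeholder name, not a real crawler):
User-agent: *
Disallow: /private/

User-agent: ExampleBot
Disallow: /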
Common Use Cases
- Blocking crawlers from admin and login pages (see the sketch after this list)
- Preventing crawling of staging or development directories
- Managing crawl budget on large sites by disallowing low-value URLs
- Adding the sitemap location to help search engines find it
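A sketch combining several of these use cases might look like the following. All paths are illustrative, and note that wildcard patterns such as /*?sort= are honored by major crawlers like Googlebot and Bingbot but are not part of the original robots.txt standard:
User-agent: *
Disallow: /admin/
Disallow: /login/
Disallow: /staging/
Disallow: /*?sort=
Sitemap: https://example.com/sitemap.xml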