Robots.txt Generator
Create customized robots.txt files to control search engine crawler access
Generated robots.txt
User-agent: *
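A generated file can contain more than the default rule above. The following sketch shows a typical layout (the paths and sitemap URL are illustrative, not recommendations):

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Rules are grouped by User-agent line; blank lines separate groups, and the Sitemap directive can appear anywhere in the file.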
Common User-Agents
- *: All robots
- Googlebot: Google's crawler
- Bingbot: Bing's crawler
- Slurp: Yahoo's crawler
Best Practices
- Place robots.txt in your website's root directory (e.g. https://example.com/robots.txt)
- Use * to apply rules to all crawlers
- Test your robots.txt with Google Search Console
- Don't rely on robots.txt to hide sensitive data - the file is publicly visible, and its rules are advisory, so misbehaving crawlers can ignore them
- Include your sitemap URL to help crawlers discover your pages
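Beyond Google Search Console, you can also sanity-check rules locally. A minimal sketch using Python's standard-library urllib.robotparser (the file contents and URLs here are hypothetical examples):

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt, parsed locally - no network fetch needed.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Ask whether a given crawler may fetch specific paths.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True
```

To check a live site instead, call set_url() with the robots.txt address and then read() before querying can_fetch().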