Robots.txt Tester
Validate and test your robots.txt file. Check for syntax errors, analyze directives, and verify which URLs are blocked or allowed for each crawler.
Enter a website URL to fetch and test its robots.txt file
Validate Your Robots.txt
Your robots.txt file controls how search engines crawl your website. Our tool helps you validate syntax, identify errors, and ensure your directives work as intended.
Fetch robots.txt from any URL or paste your own content for validation. Get detailed analysis of user-agents, allow/disallow rules, and sitemaps.
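The allow/disallow check the tool performs can be sketched with Python's standard-library robots.txt parser. This is a minimal illustration, not the tool's actual implementation, and the `example.com` URLs are placeholders:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt: one explicit Allow override inside a disallowed
# directory, plus a file-wide Sitemap reference.
ROBOTS_TXT = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Verify which URLs are blocked or allowed for a crawler.
print(parser.can_fetch("*", "https://example.com/private/secret.html"))       # blocked
print(parser.can_fetch("*", "https://example.com/private/public-page.html"))  # allowed
print(parser.site_maps())  # sitemap URLs referenced in the file
```

Note that Python's parser applies rules in file order (first match wins), which is why the `Allow` override is listed before the broader `Disallow` here.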
Syntax Validation
Check for syntax errors and invalid directives in your robots.txt.
Directive Analysis
See all user-agents, allow/disallow rules, and crawl delays.
Sitemap Detection
Identify all sitemaps referenced in your robots.txt.
Free Forever
No signup required. Part of our free SEO toolkit for everyone.
How It Works
Test your robots.txt in seconds
Enter URL
Enter any website URL to fetch its robots.txt file.
Analyze File
Our tool parses and validates every directive in the robots.txt file.
View Results
See errors, warnings, user-agents, rules, and sitemaps.
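The analyze step above amounts to grouping directives by user-agent and collecting file-wide sitemap references. Here is a simplified sketch of that parse, assuming comment stripping with `#` and the common directives only; a production validator handles many more edge cases:

```python
def parse_robots(text):
    """Group allow/disallow/crawl-delay rules by user-agent; collect sitemaps."""
    groups, sitemaps = {}, []
    agents, in_group = [], False
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()   # drop comments and whitespace
        if not line or ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if in_group:                      # rules seen: a new group starts
                agents, in_group = [], False
            agents.append(value)
            groups.setdefault(value, [])
        elif field == "sitemap":
            sitemaps.append(value)            # sitemaps apply file-wide
        elif field in ("allow", "disallow", "crawl-delay"):
            in_group = True
            for agent in agents:              # rules apply to every agent in the group
                groups[agent].append((field, value))
    return groups, sitemaps

groups, sitemaps = parse_robots("""\
User-agent: Googlebot
User-agent: Bingbot
Disallow: /tmp/
Crawl-delay: 5

Sitemap: https://example.com/sitemap.xml
""")
print(groups["Googlebot"])  # [('disallow', '/tmp/'), ('crawl-delay', '5')]
print(sitemaps)             # ['https://example.com/sitemap.xml']
```

Consecutive `User-agent` lines share one rule group, which is why both Googlebot and Bingbot receive the same rules here.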
Powerful Features
Comprehensive robots.txt testing
Syntax Validation
Identify invalid syntax and formatting errors.
User-Agent Analysis
See all user-agents and their specific rules.
Disallow Rules
View all blocked paths for each user-agent.
Allow Rules
See explicitly allowed paths and overrides.
Sitemap Detection
Find all sitemaps referenced in robots.txt.
Error Detection
Get warnings about potential issues and mistakes.
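The warning pass described above can be sketched as a small linter that flags the mistakes crawlers most often trip over: misspelled directives, missing `:` separators, and paths that don't start with `/`. The rule set here is illustrative, not exhaustive:

```python
# Directives commonly accepted by major crawlers (illustrative list).
KNOWN = {"user-agent", "allow", "disallow", "sitemap", "crawl-delay", "host"}

def lint_robots(text):
    """Return human-readable warnings for suspicious robots.txt lines."""
    warnings = []
    for num, raw in enumerate(text.splitlines(), 1):
        line = raw.split("#", 1)[0].strip()
        if not line:
            continue
        if ":" not in line:
            warnings.append(f"line {num}: missing ':' separator")
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field not in KNOWN:
            warnings.append(f"line {num}: unknown directive '{field}'")
        elif field in ("allow", "disallow") and value and not value.startswith(("/", "*")):
            warnings.append(f"line {num}: path '{value}' should start with '/'")
    return warnings

print(lint_robots("User-agent: *\nDisalow: /tmp/\nDisallow: tmp/\n"))
```

A typo like `Disalow` silently disables the rule for every crawler, so surfacing it as a warning rather than ignoring the line is the safer design.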
Frequently Asked Questions
Common questions about robots.txt testing