Robots.txt Tester
Test robots.txt rules, crawl access, and blocked resources for any URL.
Enter a URL, choose a user-agent, optionally paste custom robots.txt rules, and inspect crawl access plus blocked resources.
How to Use This Tool
Step 1
Enter the full URL you want to test.
Step 2
Choose the crawler user-agent you want to simulate.
Step 3
Optionally paste custom robots.txt rules or enable resource checks.
Step 4
Run the test and review the matched rule, crawl status, and linked resources.
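The steps above can be approximated with Python's standard-library `urllib.robotparser`; the rules and URLs below are placeholders for illustration, not output of the tool itself.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules (in practice fetched from the site's /robots.txt
# or pasted into the tool's live editor).
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Steps 1-4 condensed: test a full URL for a chosen user-agent.
print(parser.can_fetch("Googlebot", "https://example.com/private/data"))  # blocked
print(parser.can_fetch("Googlebot", "https://example.com/public/page"))   # allowed
```

Note that Python's parser applies the first matching rule for a user-agent group, whereas Google's crawler uses longest-match precedence, so results can differ on overlapping Allow/Disallow rules.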
Frequently Asked Questions
What does the Robots.txt Tester do?
It checks whether a specific URL is allowed or blocked for a selected bot based on the site's robots.txt rules (or a draft you paste in). It also shows which rule matched and can inspect linked page resources.
Can I test a draft robots.txt file before publishing it?
Yes. Turn on the live editor and paste custom robots.txt rules to test a draft file against any URL before you deploy it.
Can I test different bots like Googlebot or GPTBot?
Yes. You can switch between common crawler user-agents such as Googlebot, Bingbot, GPTBot, ChatGPT-User, and others to see how access changes.
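Per-agent differences come from the `User-agent` groups in robots.txt. A minimal sketch, again with `urllib.robotparser` and hypothetical rules that block only GPTBot:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules that treat crawlers differently.
rules = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The same URL can be allowed for one bot and blocked for another.
for agent in ("Googlebot", "Bingbot", "GPTBot"):
    allowed = parser.can_fetch(agent, "https://example.com/blog/post")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```

Here Googlebot and Bingbot fall through to the `*` group, while GPTBot matches its own group and is blocked everywhere.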
Does the tool check blocked CSS, JavaScript, and images?
Yes. If resource checking is enabled, the tool inspects linked page resources and evaluates whether they appear to be blocked by robots.txt rules.
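A resource check of this kind can be sketched in standard-library Python: collect `src`/`href` URLs from the page's HTML, resolve them against the page URL, and test each against the robots.txt rules. The HTML snippet and rules below are illustrative assumptions.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.robotparser import RobotFileParser

# Hypothetical rules that block a JavaScript directory.
rules = """\
User-agent: *
Disallow: /assets/js/
"""

robots = RobotFileParser()
robots.parse(rules.splitlines())

class ResourceCollector(HTMLParser):
    """Collect URLs from script, img, and stylesheet link tags."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("script", "img") and attrs.get("src"):
            self.resources.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.resources.append(attrs["href"])

# Sample page markup standing in for a fetched page.
html = (
    '<link rel="stylesheet" href="/assets/css/site.css">'
    '<script src="/assets/js/app.js"></script>'
    '<img src="/images/logo.png">'
)

collector = ResourceCollector()
collector.feed(html)

page_url = "https://example.com/page"
for res in collector.resources:
    full = urljoin(page_url, res)  # resolve relative paths against the page URL
    status = "allowed" if robots.can_fetch("Googlebot", full) else "blocked"
    print(f"{status}: {full}")
```

With these rules the stylesheet and image are reported as allowed and the script as blocked, which is the kind of finding that matters because blocked CSS/JS can prevent crawlers from rendering the page fully.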
Is the Robots.txt Tester safe to use?
Yes. You only provide a public URL or draft robots.txt content for testing. The tool analyzes crawl directives and does not modify the target website.