Robots.txt Tester
Inspect how cocodillaa.com controls crawler access: blocked paths, sitemap references, and AI crawler rules.
Robots.txt Status
Present
Score: 84/100 · Strong
Robots.txt Content Preview
```
# Allow AI search and agent use
User-agent: OAI-SearchBot
User-agent: ChatGPT-User
User-agent: PerplexityBot
User-agent: FirecrawlAgent
User-agent: AndiBot
User-agent: ExaBot
User-agent: PhindBot
User-agent: YouBot
Allow: /

# Disallow AI training data collection
User-agent: GPTBot
User-agent: CCBot
User-agent: Google-Extended
Disallow: /

# Allow traditional search indexing
User-agent: Googlebot
User-agent: Bingbot
Disallow: /downloads/
Allow: /

User-agent: Mediapartners-Google
Allow: /*/download

User-agent: *
Disallow: /images/thumbnail/
Disallow: /wp-admin/
Disallow: /ajax-data
Disallow: /downloads/
Allow: /wp-admin/admin-ajax.php
```
User-agent Rules
| User-agent(s) | Allowed paths | Disallowed paths |
|---|---|---|
| OAI-SearchBot | / | No explicit Disallow rules. |
| ChatGPT-User | / | No explicit Disallow rules. |
| PerplexityBot | / | No explicit Disallow rules. |
| FirecrawlAgent | / | No explicit Disallow rules. |
| AndiBot | / | No explicit Disallow rules. |
| ExaBot | / | No explicit Disallow rules. |
| PhindBot | / | No explicit Disallow rules. |
| YouBot | / | No explicit Disallow rules. |
| GPTBot | No explicit Allow rules. | / (entire site) |
| CCBot | No explicit Allow rules. | / (entire site) |
| Google-Extended | No explicit Allow rules. | / (entire site) |
| Googlebot | / | /downloads/ |
| Bingbot | / | /downloads/ |
| Mediapartners-Google | /*/download | No explicit Disallow rules. |
| * | /wp-admin/admin-ajax.php | /images/thumbnail/, /wp-admin/, /ajax-data, /downloads/ |
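The rules above can be sanity-checked locally with Python's standard `urllib.robotparser`. The sketch below parses an excerpt of the file shown earlier and queries access for two of the listed agents; the example URLs are illustrative, not taken from the report.

```python
from urllib.robotparser import RobotFileParser

# Excerpt of the robots.txt shown above (GPTBot and Googlebot groups only).
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: Googlebot
Disallow: /downloads/
Allow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# GPTBot is blocked everywhere; Googlebot is blocked only under /downloads/.
print(rp.can_fetch("GPTBot", "https://cocodillaa.com/"))                # False
print(rp.can_fetch("Googlebot", "https://cocodillaa.com/downloads/x"))  # False
print(rp.can_fetch("Googlebot", "https://cocodillaa.com/"))             # True
```

Note that `urllib.robotparser` follows the standard longest-match-style semantics only loosely (it evaluates rules in file order), so results for exotic patterns like `/*/download` may differ from Google's matcher.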
Blocked and Allowed Paths
| Directive | Paths |
|---|---|
| Blocked paths | / (GPTBot, CCBot, Google-Extended), /downloads/, /images/thumbnail/, /wp-admin/, /ajax-data |
| Allowed paths | /, /*/download, /wp-admin/admin-ajax.php |
| Crawl-delay | No Crawl-delay directive detected. |
Sitemaps Detected
No Sitemap directives found in robots.txt.
AI Crawler Policy
The AI training crawlers GPTBot, CCBot and Google-Extended are blocked by robots.txt (Disallow: /), while AI search and agent crawlers such as OAI-SearchBot, ChatGPT-User and PerplexityBot are explicitly allowed.
Issues Found
- At least one user-agent group (GPTBot, CCBot, Google-Extended) has Disallow: /, which blocks the entire site for those crawlers.
- robots.txt does not reference any sitemap URLs.
Recommendations
- Add a Sitemap directive in robots.txt pointing to your primary XML sitemap.
- Avoid blocking the entire site (Disallow: /) for crawlers you want to serve; here the blanket block applies only to AI training crawlers, so confirm that is intentional and restrict other agents to sensitive or low-value paths only.
- Review your AI crawler policy for GPTBot, ChatGPT-User, ClaudeBot, PerplexityBot and Google-Extended to ensure it matches your content strategy.
- Ensure important pages, CSS and JavaScript assets are crawlable so search engines can fully render your site.
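The first recommendation amounts to a one-line addition to the file. A minimal sketch follows; the sitemap URL is an assumption (verify the site's actual sitemap location before deploying):

```
# Declare the XML sitemap (assumed URL - confirm it resolves before deploying)
Sitemap: https://cocodillaa.com/sitemap.xml
```

Sitemap directives are not tied to any User-agent group, so the line can go anywhere in robots.txt; placing it at the top or bottom of the file keeps it easy to find.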