Robots.txt Tester
Inspect how chainswap.io controls crawler access: blocked paths, sitemap references, and AI crawler rules.
Robots.txt Status
Present
Score: 92/100 · Strong
Robots.txt Content Preview
```
User-agent: *
Disallow: */forgot-password*
Disallow: */password-reset*
Disallow: /api/
Disallow: /500.html
Disallow: */?__cf_*
Disallow: /get-pair-rate
Disallow: /get-exchange-amount

User-agent: Bingbot
Disallow: */forgot-password*
Disallow: */password-reset*
Disallow: /api/
Disallow: /500.html
Disallow: */?__cf_*

User-agent: Slurp
Disallow: */forgot-password*
Disallow: */password-reset*
Disallow: /api/
Disallow: /500.html
Disallow: */?__cf_*

Sitemap: https://chainswap.io/sitemap.xml
```
User-agent Rules
| User-agent(s) | Allowed paths | Disallowed paths |
|---|---|---|
| `*` | No explicit Allow rules. | `*/forgot-password*`, `*/password-reset*`, `/api/`, `/500.html`, `*/?__cf_*`, `/get-pair-rate`, `/get-exchange-amount` |
| `bingbot` | No explicit Allow rules. | `*/forgot-password*`, `*/password-reset*`, `/api/`, `/500.html`, `*/?__cf_*` |
| `slurp` | No explicit Allow rules. | `*/forgot-password*`, `*/password-reset*`, `/api/`, `/500.html`, `*/?__cf_*` |
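As a quick sanity check, the literal-prefix rules above can be exercised with Python's standard-library parser. Note that `urllib.robotparser` performs simple prefix matching and does not implement `*` wildcards, so this sketch only demonstrates the plain-path rules such as `/api/`.

```python
# Sketch: check a few URLs against the chainswap.io Disallow rules
# using Python's stdlib robots.txt parser. Only the literal-prefix
# rules are included; the */...* wildcard lines would not be
# interpreted by this parser anyway.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /api/
Disallow: /500.html
Disallow: /get-pair-rate
Disallow: /get-exchange-amount
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "https://chainswap.io/api/rates"))  # False
print(parser.can_fetch("*", "https://chainswap.io/"))           # True
```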
Blocked and Allowed Paths
| Directive | Value |
|---|---|
| Blocked paths | `*/forgot-password*`, `*/password-reset*`, `/api/`, `/500.html`, `*/?__cf_*`, `/get-pair-rate`, `/get-exchange-amount` |
| Allowed paths | No explicit Allow paths detected. |
| Crawl-delay | No Crawl-delay directive detected. |
Sitemaps Detected
- https://chainswap.io/sitemap.xml
AI Crawler Policy
No explicit blocks were detected for common AI crawlers (GPTBot, ChatGPT-User, ClaudeBot, PerplexityBot, Google-Extended).
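If you want to state an explicit policy, directives like the following could be added to robots.txt. Whether to allow or block is a site-owner choice; this fragment illustrates the blocking form using the user-agent tokens these crawlers publish.

```
# Example policy: explicitly block common AI crawlers.
# To explicitly allow one instead, use an empty "Disallow:" line.
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```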
Issues Found
- Wildcard (*) patterns should typically be used with explicit paths, e.g. /path/*.
Recommendations
- Document your AI crawler policy explicitly in robots.txt so future bots know how to treat your content.
- Consider adding explicit Allow rules for important sections to clarify crawling intent for complex setups.
- Ensure important pages, CSS and JavaScript assets are crawlable so search engines can fully render your site.
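The second and third recommendations could be sketched together as a robots.txt fragment. The section paths below (`/blog/`, `/assets/`) are placeholders for illustration, not paths taken from the live site.

```
# Sketch only: /blog/ and /assets/ are hypothetical section paths.
User-agent: *
Allow: /blog/          # clarify that key content sections are crawlable
Allow: /assets/        # keep CSS/JS available so pages render for search engines
Disallow: /api/
```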