Robots.txt Tester

Inspect how chainswap.io controls crawler access, including blocked paths, sitemap references, and AI crawler rules.



Robots.txt Status

Status: Present
Score: 92/100 · Strong
Domain: chainswap.io
Last analyzed: May 13, 2026


Robots.txt Content Preview

User-agent: *
Disallow: */forgot-password*
Disallow: */password-reset*
Disallow: /api/
Disallow: /500.html
Disallow: */?__cf_*
Disallow: /get-pair-rate
Disallow: /get-exchange-amount

User-agent: Bingbot
Disallow: */forgot-password*
Disallow: */password-reset*
Disallow: /api/
Disallow: /500.html
Disallow: */?__cf_*

User-agent: Slurp
Disallow: */forgot-password*
Disallow: */password-reset*
Disallow: /api/
Disallow: /500.html
Disallow: */?__cf_*

Sitemap: https://chainswap.io/sitemap.xml

User-agent Rules

User-agent: *
Allowed paths: no explicit Allow rules.
Disallowed paths:
  • */forgot-password*
  • */password-reset*
  • /api/
  • /500.html
  • */?__cf_*
  • /get-pair-rate
  • /get-exchange-amount

User-agent: Bingbot
Allowed paths: no explicit Allow rules.
Disallowed paths:
  • */forgot-password*
  • */password-reset*
  • /api/
  • /500.html
  • */?__cf_*

User-agent: Slurp
Allowed paths: no explicit Allow rules.
Disallowed paths:
  • */forgot-password*
  • */password-reset*
  • /api/
  • /500.html
  • */?__cf_*

Blocked and Allowed Paths

Blocked paths:
  • */forgot-password*
  • */password-reset*
  • /api/
  • /500.html
  • */?__cf_*
  • /get-pair-rate
  • /get-exchange-amount

Allowed paths: no explicit Allow paths detected.
Crawl-delay: no Crawl-delay directive detected.

Sitemaps Detected

  • https://chainswap.io/sitemap.xml

AI Crawler Policy

No explicit blocks were detected for common AI crawlers (GPTBot, ChatGPT-User, ClaudeBot, PerplexityBot, Google-Extended).
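
If the policy should be stated explicitly, a group like the one below could be appended to robots.txt. This is only a sketch: which bots to allow or disallow is a decision for the site owner, and the snippet simply documents the current allow-everything behavior.

# Example only: explicitly allow common AI crawlers (matches current behavior)
User-agent: GPTBot
User-agent: ChatGPT-User
User-agent: ClaudeBot
User-agent: PerplexityBot
User-agent: Google-Extended
Allow: /

# To opt out instead, replace "Allow: /" with "Disallow: /" for the relevant bots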

Issues Found

  • Wildcard (*) patterns should typically be anchored to an explicit path such as /path/*; several Disallow rules here start with */ instead. See the example after this list.
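
As an illustration, the leading-wildcard rules could be anchored to explicit paths. The /account/ prefix below is a placeholder, not a path confirmed on the live site:

# Placeholder prefix: replace /account/ with wherever these pages actually live
Disallow: /account/forgot-password
Disallow: /account/password-reset
# Query-parameter rule anchored with a leading slash
Disallow: /*?__cf_*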

Recommendations

  • Document your AI crawler policy explicitly in robots.txt so future bots know how to treat your content (see the example under AI Crawler Policy above).
  • Consider adding explicit Allow rules for important sections to clarify crawling intent in more complex setups (see the sketch after this list).
  • Ensure important pages, CSS, and JavaScript assets remain crawlable so search engines can fully render your site.
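
A rough sketch of what explicit Allow rules might look like. The /exchange/ and /blog/ sections are assumed for illustration only and would need to be replaced with the site's real top-level sections:

User-agent: *
# Hypothetical section names; substitute the site's actual sections
Allow: /exchange/
Allow: /blog/
# Keep rendering assets explicitly crawlable
Allow: /*.css$
Allow: /*.js$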
