Robots.txt Tester
Inspect how uspowders.com controls crawler access, including blocked paths, sitemap references, and AI crawler rules.
Robots.txt Status
Present
Score
92/100 · Strong
Robots.txt Content Preview
User-agent: MJ12bot
Disallow: /

User-agent: PetalBot
Disallow: /

User-agent: dotbot
Disallow: /

User-agent: SeekportBot
Disallow: /

User-agent: Googlebot
Disallow: /supporters/perks/rss/

User-agent: AhrefsBot
Crawl-Delay: 10

User-agent: AhrefsSiteAudit
Crawl-Delay: 10

User-agent: Storebot-Google
Allow: /checkout/*

User-agent: *
Allow: /
Disallow: /admin
Disallow: /localization
Disallow: /supporters/pricing
Disallow: /supporters/payments/checkout
Disallow: /supporters/payments/checkout/*
Disallow: /theme_editor
Disallow: /theme_editor/*
Disallow: /_c
Disallow: /_c/*
Disallow: /checkout/*
Disallow: /order/*
Disallow: /cart.js

Sitemap: https://uspowders.com/sitemap.xml
User-agent Rules
| User-agent(s) | Allowed paths | Disallowed paths |
|---|---|---|
| MJ12bot | No explicit Allow rules. | / |
| PetalBot | No explicit Allow rules. | / |
| dotbot | No explicit Allow rules. | / |
| SeekportBot | No explicit Allow rules. | / |
| Googlebot | No explicit Allow rules. | /supporters/perks/rss/ |
| AhrefsBot | No explicit Allow rules. | No explicit Disallow rules. |
| AhrefsSiteAudit | No explicit Allow rules. | No explicit Disallow rules. |
| Storebot-Google | /checkout/* | No explicit Disallow rules. |
| * | / | /admin, /localization, /supporters/pricing, /supporters/payments/checkout, /supporters/payments/checkout/*, /theme_editor, /theme_editor/*, /_c, /_c/*, /checkout/*, /order/*, /cart.js |
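A note on how these groups interact: under the Robots Exclusion Protocol (RFC 9309) and Google's documented behavior, a crawler obeys the group that matches its user agent, and the `*` group applies only to crawlers with no group of their own. The sketch below restates the relevant lines from this file to show the consequence for Googlebot.

```
# Googlebot matches its own group, so only this rule applies to it:
User-agent: Googlebot
Disallow: /supporters/perks/rss/

# The wildcard group below does NOT apply to Googlebot, because it is
# used only by crawlers without a dedicated group:
User-agent: *
Allow: /
Disallow: /admin
# ...remaining Disallow lines omitted for brevity
```

In practice the wildcard Disallow lines (for example /admin and /checkout/*) do not restrict Googlebot on this site; if they should, they need to be repeated inside the Googlebot group.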
Blocked and Allowed Paths
| Blocked paths | /, /supporters/perks/rss/, /admin, /localization, /supporters/pricing, /supporters/payments/checkout, /supporters/payments/checkout/*, /theme_editor, /theme_editor/*, /_c, /_c/*, /checkout/*, /order/*, /cart.js |
|---|---|
| Allowed paths | /, /checkout/* |
| Crawl-delay | 10.0 seconds (AhrefsBot, AhrefsSiteAudit) |
Sitemaps Detected
- https://uspowders.com/sitemap.xml
AI Crawler Policy
No explicit blocks were detected for common AI crawlers (GPTBot, ChatGPT-User, ClaudeBot, PerplexityBot, Google-Extended).
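These crawlers currently inherit the wildcard (`*`) rules because they have no groups of their own. If you want the policy to be explicit rather than implied, you can add dedicated groups for them. The snippet below only illustrates the syntax, with an arbitrary split between blocked and allowed bots; it is not a recommendation for which AI crawlers to permit.

```
# Illustrative AI-crawler policy (adjust to your actual intent):
User-agent: GPTBot
User-agent: Google-Extended
Disallow: /

User-agent: ChatGPT-User
User-agent: ClaudeBot
User-agent: PerplexityBot
Allow: /
```

Listing multiple User-agent lines above a shared rule set is valid under the Robots Exclusion Protocol, so the two blocks above define one policy for the first pair of crawlers and another for the remaining three.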
Issues Found
- At least one user-agent group uses Disallow: /, which blocks those crawlers from the entire site.
Recommendations
- Avoid blocking the entire site (Disallow: /); restrict only sensitive or low-value paths instead (see the sketch after this list).
- Document your AI crawler policy explicitly in robots.txt so future bots know how to treat your content.
- Ensure important pages, CSS and JavaScript assets are crawlable so search engines can fully render your site.
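As a rough sketch of the first and third recommendations, the snippet below scopes disallows to sensitive areas instead of blocking a crawler site-wide, and explicitly allows CSS and JavaScript so pages can be rendered. "ExampleBot" is a placeholder name, and the `*.css$` / `*.js$` wildcard pattern is honored by Google and most major crawlers but is not guaranteed for every bot.

```
# Scope disallows to sensitive areas instead of Disallow: / for a whole bot:
User-agent: ExampleBot
Disallow: /admin
Disallow: /checkout/*
Disallow: /order/*

# Keep rendering assets crawlable; wildcard + $ matching is supported by
# Google and most major crawlers:
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```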