Robots.txt Tester

Inspect how uspowders.com controls crawler access: blocked paths, sitemap references, and AI crawler rules.


Robots.txt Status

Robots.txt Status: Present
Score: 92/100 · Strong
Domain: uspowders.com
Last analyzed: May 13, 2026


Robots.txt Content Preview

User-agent: MJ12bot
Disallow: /

User-agent: PetalBot
Disallow: /

User-agent: dotbot
Disallow: /

User-agent: SeekportBot
Disallow: /

User-agent: Googlebot
Disallow: /supporters/perks/rss/

User-agent: AhrefsBot
Crawl-Delay: 10

User-agent: AhrefsSiteAudit
Crawl-Delay: 10

User-agent: Storebot-Google
Allow: /checkout/*

User-agent: *
Allow: /
Disallow: /admin
Disallow: /localization
Disallow: /supporters/pricing
Disallow: /supporters/payments/checkout
Disallow: /supporters/payments/checkout/*
Disallow: /theme_editor
Disallow: /theme_editor/*
Disallow: /_c
Disallow: /_c/*
Disallow: /checkout/*
Disallow: /order/*
Disallow: /cart.js

Sitemap: https://uspowders.com/sitemap.xml
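The file above can be checked programmatically. A minimal sketch using Python's standard-library parser; note that `urllib.robotparser` applies rules first-match and treats `*` inside paths literally (unlike Google's longest-match semantics), so the wildcard paths and the broad `Allow: /` line are left out of this simplified excerpt:

```python
from urllib.robotparser import RobotFileParser

# Simplified excerpt of the robots.txt shown above. urllib.robotparser
# evaluates rules first-match and does not expand "*" inside paths,
# so wildcard rules and "Allow: /" are intentionally omitted here.
rules = """\
User-agent: MJ12bot
Disallow: /

User-agent: Googlebot
Disallow: /supporters/perks/rss/

User-agent: AhrefsBot
Crawl-delay: 10

User-agent: *
Disallow: /admin
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("MJ12bot", "/products/"))                    # False: site-wide block
print(parser.can_fetch("Googlebot", "/supporters/perks/rss/feed"))  # False
print(parser.can_fetch("Googlebot", "/products/"))                  # True
print(parser.can_fetch("SomeOtherBot", "/admin/users"))             # False: falls under *
print(parser.crawl_delay("AhrefsBot"))                              # 10
```

For production checks against Google's actual matching semantics, a spec-compliant parser (per RFC 9309) is the safer choice.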

User-agent Rules

mj12bot
  • Allowed paths: no explicit Allow rules
  • Disallowed paths: /
petalbot
  • Allowed paths: no explicit Allow rules
  • Disallowed paths: /
dotbot
  • Allowed paths: no explicit Allow rules
  • Disallowed paths: /
seekportbot
  • Allowed paths: no explicit Allow rules
  • Disallowed paths: /
googlebot
  • Allowed paths: no explicit Allow rules
  • Disallowed paths: /supporters/perks/rss/
ahrefsbot
  • Allowed paths: no explicit Allow rules
  • Disallowed paths: no explicit Disallow rules
ahrefssiteaudit
  • Allowed paths: no explicit Allow rules
  • Disallowed paths: no explicit Disallow rules
storebot-google
  • Allowed paths: /checkout/*
  • Disallowed paths: no explicit Disallow rules
* (all other crawlers)
  • Allowed paths: /
  • Disallowed paths: /admin, /localization, /supporters/pricing, /supporters/payments/checkout, /supporters/payments/checkout/*, /theme_editor, /theme_editor/*, /_c, /_c/*, /checkout/*, /order/*, /cart.js

Blocked and Allowed Paths

Blocked paths (aggregated across all user-agent groups)
  • /
  • /supporters/perks/rss/
  • /admin
  • /localization
  • /supporters/pricing
  • /supporters/payments/checkout
  • /supporters/payments/checkout/*
  • /theme_editor
  • /theme_editor/*
  • /_c
  • /_c/*
  • /checkout/*
  • /order/*
  • /cart.js
Allowed paths
  • /checkout/*
  • /
Crawl-delay: 10 seconds (AhrefsBot and AhrefsSiteAudit only)

Sitemaps Detected

  • https://uspowders.com/sitemap.xml

AI Crawler Policy

No explicit blocks were detected for common AI crawlers (GPTBot, ChatGPT-User, ClaudeBot, PerplexityBot, Google-Extended).
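If that policy should be explicit rather than implied, AI crawlers are addressed with ordinary user-agent groups. A hypothetical fragment using the agents named above; whether to allow or block each one is a policy choice, not a recommendation:

```
# Explicitly allow AI crawlers (makes the current implicit behavior explicit):
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Or, to opt out of AI training while keeping normal search indexing:
User-agent: Google-Extended
Disallow: /
```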

Issues Found

  • Four user-agents (MJ12bot, PetalBot, dotbot, SeekportBot) have Disallow: /, which blocks the entire site for those crawlers.

Recommendations

  • Avoid blocking the entire site (Disallow: /); restrict only sensitive or low-value paths instead.
  • Document your AI crawler policy explicitly in robots.txt so future bots know how to treat your content.
  • Ensure important pages, CSS and JavaScript assets are crawlable so search engines can fully render your site.
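The first recommendation can be enforced with a quick check. A minimal sketch that flags user-agent groups containing a literal `Disallow: /` line; the function name and grouping logic are my own, and it does not handle wildcard or partial-path rules:

```python
def fully_blocked_agents(robots_txt: str) -> list[str]:
    """Return user-agents whose group contains a literal 'Disallow: /'."""
    blocked = []
    current = []            # user-agents collected for the group being read
    reading_agents = True   # still collecting consecutive User-agent lines?
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if not reading_agents:   # a rule ended the previous group
                current = []
                reading_agents = True
            current.append(value)
        else:
            reading_agents = False
            if field == "disallow" and value == "/":
                blocked.extend(current)
    return blocked

robots = """\
User-agent: MJ12bot
Disallow: /

User-agent: Googlebot
Disallow: /supporters/perks/rss/

User-agent: *
Disallow: /admin
"""
print(fully_blocked_agents(robots))   # ['MJ12bot']
```

Running this against the full file above would report MJ12bot, PetalBot, dotbot, and SeekportBot.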
