Robots.txt Tester

Inspect how docs.google.com controls crawler access: blocked paths, sitemap references, and AI crawler rules.


Robots.txt Status

Status: Present
Score: 84/100 · Strong
Domain: docs.google.com
Last analyzed: May 14, 2026


Robots.txt Content Preview

User-agent: *
Crawl-delay: 1
Allow: /$
Allow: /?hl=
Disallow: /?hl=*&
Allow: /support/
Allow: /a/
Allow: /Doc
Allow: /View
Allow: /ViewDoc
Allow: /present
Allow: /Present
Allow: /TeamPresent
Allow: /EmbedSlideshow
Allow: /presentation
Allow: /templates
Allow: /previewtemplate
Allow: /fileview
Allow: /gview
Allow: /viewer
Allow: /leaf
Allow: /file
Allow: /open
Allow: /document
Allow: /drawings
Allow: /demo
Allow: /folder
Allow: /start
Allow: /spreadsheet
Allow: /forms
Allow: /macros
Allow: /keep
Allow: /static
Allow: /drive/
Allow: /videos
Disallow: /templateabuse
Disallow: /
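
These rules can be sanity-checked with Python's standard urllib.robotparser (a minimal sketch using only stdlib calls). Two caveats: the module matches rules in file order (first match wins) rather than by Google's longest-match rule, and it does not implement the * and $ wildcards, so entries such as Allow: /$ and Disallow: /?hl=*& are treated as literal prefixes. For the plain-prefix paths below, its answers agree with Google's interpretation, because every relevant Allow line precedes the final Disallow: /.

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://docs.google.com/robots.txt")
rp.read()  # fetch and parse the live file

# Paths covered by an Allow prefix are fetchable; everything else
# falls through to the catch-all Disallow: /
print(rp.can_fetch("*", "https://docs.google.com/document/d/abc"))   # True
print(rp.can_fetch("*", "https://docs.google.com/templateabuse"))    # False
print(rp.can_fetch("*", "https://docs.google.com/some-other-path"))  # False

# The Crawl-delay: 1 directive is exposed as well
print(rp.crawl_delay("*"))  # 1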

User-agent Rules

User-agent(s): *

Allowed paths:
  • /$
  • /?hl=
  • /support/
  • /a/
  • /Doc
  • /View
  • /ViewDoc
  • /present
  • /Present
  • /TeamPresent
  • /EmbedSlideshow
  • /presentation
  • /templates
  • /previewtemplate
  • /fileview
  • /gview
  • /viewer
  • /leaf
  • /file
  • /open
  • /document
  • /drawings
  • /demo
  • /folder
  • /start
  • /spreadsheet
  • /forms
  • /macros
  • /keep
  • /static
  • /drive/
  • /videos

Disallowed paths:
  • /?hl=*&
  • /templateabuse
  • /

Blocked and Allowed Paths

Blocked paths
  • /?hl=*&
  • /templateabuse
  • /
Allowed paths
  • /$
  • /?hl=
  • /support/
  • /a/
  • /Doc
  • /View
  • /ViewDoc
  • /present
  • /Present
  • /TeamPresent
  • /EmbedSlideshow
  • /presentation
  • /templates
  • /previewtemplate
  • /fileview
  • /gview
  • /viewer
  • /leaf
  • /file
  • /open
  • /document
  • /drawings
  • /demo
  • /folder
  • /start
  • /spreadsheet
  • /forms
  • /macros
  • /keep
  • /static
  • /drive/
  • /videos

Crawl-delay: 1.0 seconds
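
The /$ and /?hl=*& entries use the wildcard syntax from Google's robots.txt specification: * matches any sequence of characters and a trailing $ anchors the pattern to the end of the path, with the longest matching pattern taking precedence and Allow winning ties. The Python sketch below (hypothetical helper names, abridged rule list) traces how that evaluation plays out for this file.

import re

def pattern_to_regex(pattern):
    """Compile a robots.txt path pattern: '*' matches any sequence,
    a trailing '$' anchors the match to the end of the path."""
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    return re.compile("^" + body + ("$" if anchored else ""))

# Abridged subset of the '*' group shown above
RULES = [
    ("allow", "/$"),
    ("allow", "/?hl="),
    ("disallow", "/?hl=*&"),
    ("allow", "/document"),
    ("disallow", "/templateabuse"),
    ("disallow", "/"),
]

def is_allowed(path):
    """Longest matching pattern wins; Allow wins a length tie
    (Google's documented precedence rule)."""
    best_rule, best_len = "allow", -1  # unmatched paths default to allowed
    for directive, pattern in RULES:
        if pattern_to_regex(pattern).match(path):
            if len(pattern) > best_len or (len(pattern) == best_len and directive == "allow"):
                best_rule, best_len = directive, len(pattern)
    return best_rule == "allow"

print(is_allowed("/"))               # True:  Allow: /$ (len 2) beats Disallow: / (len 1)
print(is_allowed("/?hl=en"))         # True:  Allow: /?hl= is the longest match
print(is_allowed("/?hl=en&x=1"))     # False: Disallow: /?hl=*& (len 8) beats Allow: /?hl= (len 5)
print(is_allowed("/document/d/x"))   # True:  Allow: /document beats the catch-all
print(is_allowed("/anything-else"))  # False: only the catch-all Disallow: / matches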

Sitemaps Detected

No Sitemap directives found in robots.txt.

AI Crawler Policy

No explicit blocks were detected for common AI crawlers (GPTBot, ChatGPT-User, ClaudeBot, PerplexityBot, Google-Extended).
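
For comparison, an explicit per-crawler policy is just an additional user-agent group in robots.txt. The lines below are an illustrative sketch of what such a policy could look like; they are not part of the actual docs.google.com file:

User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /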

Issues Found

  • The catch-all user-agent group (*) ends with Disallow: /, which blocks every path not matched by a more specific Allow rule.
  • robots.txt does not reference any sitemap URLs.

Recommendations

  • Add a Sitemap directive in robots.txt pointing to your primary XML sitemap (see the example after this list).
  • Avoid blocking the entire site (Disallow: /); restrict only sensitive or low-value paths instead.
  • Document your AI crawler policy explicitly in robots.txt so future bots know how to treat your content.
  • Ensure important pages, CSS and JavaScript assets are crawlable so search engines can fully render your site.
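
The Sitemap recommendation is a one-line change: a Sitemap directive sits at the top level of robots.txt, outside any user-agent group, and takes an absolute URL. The URL below is a hypothetical placeholder, not the site's real sitemap:

Sitemap: https://docs.google.com/sitemap.xml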
