Robots.txt Tester

Inspect how vnlibs.com controls crawler access: blocked paths, sitemap references, and AI crawler rules.



Robots.txt Status

Status: Present
Score: 100/100 · Strong
Domain: vnlibs.com
Last analyzed: May 15, 2026


Robots.txt Content Preview

User-agent: *

# 1. Block Internal Search Results (SEO Best Practice to save Crawl Budget)
Disallow: /search
Disallow: /search/
Disallow: /search.html
Disallow: /*?q=
Disallow: /*&q=
Disallow: /404.html
Disallow: /404

# 2. Block Tracking Parameters (Prevent Duplicate Content)
Disallow: /*?*utm_
Disallow: /*?*utm-source
Disallow: /*?*utm_medium
Disallow: /*?*fbclid
Disallow: /*?*gclid
Disallow: /*?*ref
Disallow: /*?*referrer
Disallow: /*?*session
Disallow: /*?*share=
Disallow: /*?*filter=
Disallow: /*?*sort=
Disallow: /*?*lang=

# 3. Explicitly Allow Assets & Main Content (Ensure Google can render the page)
Allow: /
Allow: /assets/
Allow: /images/
Allow: /*.css$
Allow: /*.js$
Allow: /*.png$
Allow: /*.jpg$
Allow: /*.jpeg$
Allow: /*.gif$
Allow: /*.svg$
Allow: /*.webp$
Allow: /*.json$
Allow: /*.xml$

# 4. Sitemap Location
Sitemap: https://vnlibs.com/sitemap.xml
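Python's stdlib urllib.robotparser does not implement the `*` and `$` wildcards used in this file, so a quick way to check a URL against these rules is a small Googlebot-style matcher. A minimal sketch (assumption: longest matching pattern wins, and Allow beats Disallow on ties, per Google's documented evaluation order):

```python
import re

def pattern_to_regex(pattern: str) -> re.Pattern:
    """Translate a robots.txt path pattern to a regex.
    '*' matches any run of characters; a trailing '$' anchors the end."""
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    return re.compile("^" + body + ("$" if anchored else ""))

def is_allowed(path: str, allows: list, disallows: list) -> bool:
    """Googlebot-style evaluation: the longest matching pattern wins,
    and Allow beats Disallow when pattern lengths tie."""
    best_len, verdict = -1, True  # no matching rule at all => allowed
    for rules, allowed in ((allows, True), (disallows, False)):
        for pat in rules:
            if pattern_to_regex(pat).match(path) and len(pat) > best_len:
                best_len, verdict = len(pat), allowed
    return verdict
```

For example, `is_allowed("/page?utm_source=x", ["/"], ["/*?*utm_"])` returns False because the Disallow pattern is longer than the matching `/` Allow, while `/assets/app.css` stays crawlable via `/*.css$`.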

User-agent Rules

User-agent(s): *

Allowed paths
  • /
  • /assets/
  • /images/
  • /*.css$
  • /*.js$
  • /*.png$
  • /*.jpg$
  • /*.jpeg$
  • /*.gif$
  • /*.svg$
  • /*.webp$
  • /*.json$
  • /*.xml$

Disallowed paths
  • /search
  • /search/
  • /search.html
  • /*?q=
  • /*&q=
  • /404.html
  • /404
  • /*?*utm_
  • /*?*utm-source
  • /*?*utm_medium
  • /*?*fbclid
  • /*?*gclid
  • /*?*ref
  • /*?*referrer
  • /*?*session
  • /*?*share=
  • /*?*filter=
  • /*?*sort=
  • /*?*lang=

Crawl-delay

No Crawl-delay directive detected.
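The absence of Crawl-delay is normal: Googlebot ignores the directive entirely, while Bing and Yandex honor it per user-agent group. If throttling were ever needed for those crawlers, a minimal sketch (the 10-second value is purely illustrative):

```
User-agent: Bingbot
Crawl-delay: 10
```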

Sitemaps Detected

  • https://vnlibs.com/sitemap.xml

AI Crawler Policy

No explicit blocks were detected for common AI crawlers (GPTBot, ChatGPT-User, ClaudeBot, PerplexityBot, Google-Extended).

Recommendations

  • Document your AI crawler policy explicitly in robots.txt so future bots know how to treat your content.
  • Ensure important pages, CSS and JavaScript assets are crawlable so search engines can fully render your site.
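An explicit policy can go either direction; what matters is that it is stated. A minimal sketch using the AI crawlers named in this report (whether each group gets Allow: / or Disallow: / is the site owner's choice):

```
# Explicit AI crawler policy: allow access to all content.
# Swap "Allow: /" for "Disallow: /" on any group to opt out.
User-agent: GPTBot
User-agent: ChatGPT-User
User-agent: ClaudeBot
User-agent: PerplexityBot
User-agent: Google-Extended
Allow: /
```

Grouping the user-agents over a single rule block keeps the policy in one place, so adding or removing a crawler is a one-line change.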

