Robots.txt Tester
Inspect how vnlibs.com controls crawler access: blocked paths, sitemap references, and AI crawler rules.
Robots.txt Status
Present
Score: 100/100 · Strong
Robots.txt Content Preview
```
User-agent: *

# 1. Block Internal Search Results (SEO Best Practice to save Crawl Budget)
Disallow: /search
Disallow: /search/
Disallow: /search.html
Disallow: /*?q=
Disallow: /*&q=
Disallow: /404.html
Disallow: /404

# 2. Block Tracking Parameters (Prevent Duplicate Content)
Disallow: /*?*utm_
Disallow: /*?*utm-source
Disallow: /*?*utm_medium
Disallow: /*?*fbclid
Disallow: /*?*gclid
Disallow: /*?*ref
Disallow: /*?*referrer
Disallow: /*?*session
Disallow: /*?*share=
Disallow: /*?*filter=
Disallow: /*?*sort=
Disallow: /*?*lang=

# 3. Explicitly Allow Assets & Main Content (Ensure Google can render the page)
Allow: /
Allow: /assets/
Allow: /images/
Allow: /*.css$
Allow: /*.js$
Allow: /*.png$
Allow: /*.jpg$
Allow: /*.jpeg$
Allow: /*.gif$
Allow: /*.svg$
Allow: /*.webp$
Allow: /*.json$
Allow: /*.xml$

# 4. Sitemap Location
Sitemap: https://vnlibs.com/sitemap.xml
```
User-agent Rules

| User-agent(s) | Allowed paths | Disallowed paths |
|---|---|---|
| * | /, /assets/, /images/, *.css, *.js, *.png, *.jpg, *.jpeg, *.gif, *.svg, *.webp, *.json, *.xml | /search, /search/, /search.html, /*?q=, /*&q=, /404, /404.html, tracking-parameter patterns (utm_, fbclid, gclid, ref, referrer, session, share=, filter=, sort=, lang=) |
Blocked and Allowed Paths

| Directive | Paths |
|---|---|
| Blocked paths | /search, /search/, /search.html, /*?q=, /*&q=, /404.html, /404, plus tracking-parameter patterns (utm_, fbclid, gclid, ref, referrer, session, share=, filter=, sort=, lang=) |
| Allowed paths | /, /assets/, /images/, and static asset extensions (*.css, *.js, *.png, *.jpg, *.jpeg, *.gif, *.svg, *.webp, *.json, *.xml) |
| Crawl-delay | No Crawl-delay directive detected. |
Sitemaps Detected

- https://vnlibs.com/sitemap.xml
AI Crawler Policy
No explicit blocks were detected for common AI crawlers (GPTBot, ChatGPT-User, ClaudeBot, PerplexityBot, Google-Extended).
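If you want to declare a policy for these crawlers explicitly, each one can be addressed by its user-agent token. A minimal sketch, assuming you choose to block them from the whole site (the decision to block, and which bots to include, are policy choices, not part of the current file):

```
# Optional: block AI training crawlers site-wide
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

To allow them instead while still making the policy explicit, replace `Disallow: /` with `Allow: /` for the relevant agents.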
Recommendations
- Document your AI crawler policy explicitly in robots.txt so future bots know how to treat your content.
- Ensure important pages, CSS and JavaScript assets are crawlable so search engines can fully render your site.
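To spot-check that key pages and assets stay crawlable as the file evolves, Python's standard `urllib.robotparser` can evaluate rules against sample URLs. Note that it matches plain path prefixes and does not implement the `*`/`$` wildcard extensions used above, so this sketch exercises only prefix rules; the sample URLs are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Simplified prefix-only subset of the rules above.
rules = """\
User-agent: *
Disallow: /search
Disallow: /404
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Internal search results should be blocked for all crawlers.
print(rp.can_fetch("*", "https://vnlibs.com/search?q=test"))   # -> False
# Static assets must stay crawlable so pages can be rendered.
print(rp.can_fetch("*", "https://vnlibs.com/assets/app.css"))  # -> True
```

Running a handful of such checks in CI catches an accidental `Disallow` of CSS/JavaScript before it reaches production.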