AI crawler access audit

Check If AI Crawlers Can Access Your Website

Test access for GPTBot, ClaudeBot, PerplexityBot, Google-Extended, and OAI-SearchBot, along with robots.txt, headers, meta robots, sitemap, and llms.txt signals, in one transparent report.

What it checks

More than a robots.txt glance

A robots.txt policy can look open while the page itself still blocks discovery through response headers, meta robots tags, redirects, thin rendered content, or missing AI-readable discovery files.
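A minimal sketch of that mismatch, assuming an illustrative robots.txt and response headers (example.com and the header values are placeholders, not a live fetch):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical policy: robots.txt allows GPTBot everywhere.
rp = RobotFileParser()
rp.parse([
    "User-agent: GPTBot",
    "Allow: /",
])
can_fetch = rp.can_fetch("GPTBot", "https://example.com/post")  # True

# ...yet the page's response headers can still block indexing.
headers = {"X-Robots-Tag": "noindex, nofollow"}  # illustrative response headers
directives = {d.strip().lower() for d in headers["X-Robots-Tag"].split(",")}
blocked = "noindex" in directives

print(can_fetch, blocked)  # True True: fetchable by policy, blocked at the page level
```

Both signals have to agree before a page is genuinely discoverable, which is why a robots.txt glance alone is not enough.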

Check public crawler policy

Parse robots.txt rules for major AI search, AI training, and classic search crawlers against the exact URL path.
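A per-agent policy check can be sketched with the standard library's robots.txt parser. The robots.txt content below is an assumed example, not any real site's file:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt (assumption for the sketch)
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /private/

User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
"""

AI_AGENTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended", "OAI-SearchBot"]

def crawler_policy(robots_txt: str, url: str) -> dict:
    """Return, per AI user agent, whether the exact URL is fetchable."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return {agent: rp.can_fetch(agent, url) for agent in AI_AGENTS}

report = crawler_policy(ROBOTS_TXT, "https://example.com/blog/post")
print(report)
# GPTBot: allowed (the path is not under /private/); Google-Extended: blocked
# site-wide; the remaining agents fall back to the wildcard group and are allowed.
```

Checking the exact path matters because a group can allow the site root while disallowing the specific directory a page lives in.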

Inspect page-level signals

Review HTTP status, redirects, meta robots, X-Robots-Tag, canonical, readable text, and JSON-LD.
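Two of those page-level signals, meta robots and X-Robots-Tag, can be extracted as in this sketch; the HTML and header values are assumed examples, and a real audit would also follow redirects and record status, canonical, and rendered text:

```python
from html.parser import HTMLParser

class MetaRobotsParser(HTMLParser):
    """Collect directives from <meta name="robots" content="..."> tags."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.directives |= {d.strip().lower() for d in (a.get("content") or "").split(",")}

# Illustrative page and response headers (assumptions, not a live fetch)
HTML = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
parser = MetaRobotsParser()
parser.feed(HTML)

headers = {"X-Robots-Tag": "none"}
header_directives = {d.strip().lower() for d in headers["X-Robots-Tag"].split(",")}

# "noindex" or "none" at either level is enough to block discovery.
blocked = bool({"noindex", "none"} & (parser.directives | header_directives))
print(sorted(parser.directives), blocked)
```

Evaluating both sources together matters because either one alone can silently override an otherwise open robots.txt policy.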

Find AI-readiness gaps

Check for a sitemap, llms.txt, and llms-full.txt, and get copy-ready fixes without promising guaranteed AI visibility.
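Those discovery files live at the site root, so the URLs to probe can be derived from any page URL. A sketch, with example.com as a placeholder:

```python
from urllib.parse import urljoin, urlparse

# Root-level files an AI-readiness check would probe
DISCOVERY_FILES = ["robots.txt", "sitemap.xml", "llms.txt", "llms-full.txt"]

def discovery_urls(page_url: str) -> list:
    """Build root-level discovery URLs from any page URL on the site."""
    p = urlparse(page_url)
    root = f"{p.scheme}://{p.netloc}/"
    return [urljoin(root, name) for name in DISCOVERY_FILES]

urls = discovery_urls("https://example.com/blog/post?utm=1")
print(urls)
# ['https://example.com/robots.txt', 'https://example.com/sitemap.xml',
#  'https://example.com/llms.txt', 'https://example.com/llms-full.txt']
```

A checker would then issue a GET for each URL and report which files exist, since a missing llms.txt is a gap to flag, not an error.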

Boundaries

A check, not a guarantee

This tool diagnoses public access signals that site owners control. It does not bypass bot defenses, log into websites, or promise citation in any AI answer surface.

Visibility mode

Find accidental blocks that may prevent AI search or retrieval systems from reading public pages.

Protection mode

See which training or retrieval bots you are allowing, then choose a policy intentionally.

Evidence first

Every recommendation ties back to a visible rule, header, tag, status, or missing public file.