AgentAudit is not a GEO tool. We do not help you "show up in ChatGPT" or get mentioned in AI summaries. That is a different problem. AgentAudit solves a specific, concrete infrastructure problem: can an AI agent actually navigate your website?
AI agents — autonomous programs that browse the web on behalf of users — are increasingly being used to book appointments, compare services, submit forms, and complete purchases. They work by fetching HTML, reading structured data, and identifying interactive elements. If your site was built for human browsers with JavaScript-rendered content and third-party booking widgets, agents fail silently.
You never find out when a potential customer's AI assistant fails on your site. The customer just sees "I couldn't complete that booking" and moves on to a competitor whose site works.
The robots.txt check fetches your robots.txt and verifies that GPTBot, ClaudeBot, PerplexityBot, and other AI crawlers are explicitly allowed. Many sites accidentally block these bots while intending only to block spam crawlers. This check is weighted most heavily because a blocked crawler means zero access, no matter how well-built the rest of your site is.
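The core of that check can be sketched with Python's standard-library robots.txt parser. The crawler list and function name below are illustrative, not AgentAudit's actual API:

```python
from urllib.robotparser import RobotFileParser

# A representative subset of AI crawler user-agents to verify.
AI_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot"]

def check_ai_crawlers(robots_txt: str, path: str = "/") -> dict:
    """Map each AI crawler to whether robots.txt lets it fetch `path`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {bot: parser.can_fetch(bot, path) for bot in AI_CRAWLERS}
```

A robots.txt that disallows GPTBot but allows everyone else would report `{"GPTBot": False, "ClaudeBot": True, "PerplexityBot": True}` — exactly the "accidentally blocked" pattern the check is looking for.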
The rendering check visits your page twice: once with a real browser (JavaScript on) and once without. It compares the word counts to calculate what percentage of your content requires JavaScript to appear. If that number is above 30%, a significant portion of your site is invisible to agents that fetch raw HTML. Screenshots of both views are captured so you can see exactly what an agent sees.
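The word-count comparison itself reduces to a small function. This is a sketch that assumes both views have already been extracted to plain text (the real check drives a headless browser to obtain the rendered view):

```python
import re

def js_dependency_ratio(raw_text: str, rendered_text: str) -> float:
    """Fraction of the rendered word count missing from the raw-HTML view."""
    raw = len(re.findall(r"\w+", raw_text))
    rendered = len(re.findall(r"\w+", rendered_text))
    if rendered == 0:
        return 0.0  # nothing renders at all; nothing to attribute to JS
    return max(0.0, 1.0 - raw / rendered)
```

A page whose raw HTML carries 700 of 1,000 rendered words scores 0.3, right at the 30% threshold described above.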
The semantic structure check parses your raw HTML for proper use of landmark elements: <main>, <nav>, <article>, <section>, <header>, <footer>. It also validates heading hierarchy (H1 → H2 → H3 without skipped levels). These elements are how agents orient themselves within a page; without them, a page is just a blob of text.
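Both parts of that check fit in one pass with the standard-library HTML parser. A minimal sketch, with illustrative class and attribute names:

```python
from html.parser import HTMLParser

LANDMARKS = {"main", "nav", "article", "section", "header", "footer"}

class StructureAudit(HTMLParser):
    """Record which landmark elements appear and any skipped heading levels."""

    def __init__(self):
        super().__init__()
        self.landmarks_found = set()
        self.heading_skips = []   # list of (previous_level, jumped_to_level)
        self._last_level = 0

    def handle_starttag(self, tag, attrs):
        if tag in LANDMARKS:
            self.landmarks_found.add(tag)
        # Headings are h1..h6; flag any jump of more than one level down.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            level = int(tag[1])
            if self._last_level and level > self._last_level + 1:
                self.heading_skips.append((self._last_level, level))
            self._last_level = level
```

Feeding it a page whose H1 is followed directly by an H3 yields one recorded skip, (1, 3), flagging the missing H2.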
The structured data check looks for JSON-LD and microdata attributes. Schema markup tells AI agents what type of entity your site is, what services you offer, your location, and your pricing: information they need to act on your behalf. If none is found, we infer your business type from page content and generate appropriate markup.
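The JSON-LD half of the check amounts to collecting the contents of `<script type="application/ld+json">` blocks and pulling out their `@type` values. A sketch under that assumption (class name illustrative; real-world pages add wrinkles like `@graph` containers that this ignores):

```python
import json
from html.parser import HTMLParser

class JsonLdScanner(HTMLParser):
    """Collect @type values from JSON-LD script blocks in raw HTML."""

    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self.types = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self._in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_jsonld = False

    def handle_data(self, data):
        if not self._in_jsonld:
            return
        try:
            doc = json.loads(data)
        except json.JSONDecodeError:
            return  # malformed JSON-LD is itself an audit finding
        items = doc if isinstance(doc, list) else [doc]
        self.types += [i.get("@type") for i in items if isinstance(i, dict)]
```

A salon site with proper markup would surface a type like `LocalBusiness` or `HairSalon`; an empty `types` list is what triggers the markup-generation step.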
The form check determines whether your forms (booking, contact, checkout) exist in the initial HTML response or only appear after JavaScript runs. Forms that require JS execution are invisible to agents that parse raw HTML. It also detects third-party booking forms served via iframe (OpenTable, Calendly, Vagaro, etc.) and provides specific guidance for each.
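A rough version of the raw-HTML scan can be written with regular expressions; a production check would use a real HTML parser, and the host list below is just the three examples named above:

```python
import re

# Illustrative subset of third-party booking platforms to flag.
BOOKING_HOSTS = ("opentable.com", "calendly.com", "vagaro.com")

def audit_forms(raw_html: str) -> dict:
    """Count native <form> elements and flag third-party booking iframes."""
    native_forms = len(re.findall(r"<form\b", raw_html, re.IGNORECASE))
    iframe_srcs = re.findall(r'<iframe[^>]+src="([^"]+)"', raw_html, re.IGNORECASE)
    third_party = [src for src in iframe_srcs
                   if any(host in src for host in BOOKING_HOSTS)]
    return {"native_forms": native_forms, "third_party_booking": third_party}
```

Zero native forms plus a Calendly iframe is the classic failure mode: the booking flow exists, but only inside an embed an agent cannot reach from your raw HTML.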
The llms.txt check looks for a /llms.txt file: a plain-text summary of your site's purpose, key pages, and services, designed specifically for AI agents. It is similar to robots.txt but aimed at LLMs rather than web crawlers. We generate a complete, ready-to-deploy llms.txt based on your page content and navigation.
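The generated file follows the llms.txt convention of markdown-flavored plain text: an H1 title, a one-line blockquote summary, then a list of key pages. A sketch of such a generator, with illustrative function name and inputs:

```python
def generate_llms_txt(title: str, summary: str, pages: dict) -> str:
    """Render a minimal llms.txt: H1 title, blockquote summary, link list."""
    lines = [f"# {title}", "", f"> {summary}", "", "## Pages", ""]
    lines += [f"- [{name}]({url})" for name, url in pages.items()]
    return "\n".join(lines) + "\n"
```

The inputs here (title, summary, page map) are the pieces the audit infers from your page content and navigation before writing the file.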
The metadata check covers meta title, meta description, canonical URL, and Open Graph tags. These basic signals help agents correctly identify and describe your page. If PAGESPEED_API_KEY is configured, it also queries the Google PageSpeed Insights API for a full performance score.
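Detecting these signals is a single pass over the document head; the class name and result keys here are illustrative:

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Flag which basic metadata signals are present in a page's raw HTML."""

    def __init__(self):
        super().__init__()
        self.found = {"title": False, "description": False,
                      "canonical": False, "open_graph": False}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self.found["title"] = True
        elif tag == "meta" and a.get("name") == "description":
            self.found["description"] = True
        elif tag == "link" and a.get("rel") == "canonical":
            self.found["canonical"] = True
        elif tag == "meta" and (a.get("property") or "").startswith("og:"):
            self.found["open_graph"] = True
```

Any `False` left in `found` after parsing becomes a finding in the report.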
AI agents helping users choose, compare, and sign up for software need to navigate your pricing page, read your feature descriptions, and find your signup form — all without JavaScript.
Booking, reservation, and contact forms need to be discoverable and completable by agents acting on behalf of users. If your booking widget is a JavaScript-only iframe from a third-party platform, agents need alternative access paths.
Product pages, prices, availability, and checkout flows need to be readable from raw HTML. Schema markup for Product, Offer, and availability is critical for agents comparing options for users.
Legal, medical, and consulting firms need accurate schema markup describing their services and location so agents can correctly identify and recommend them.
Questions? etmanski.business@gmail.com