Make any website readable by agents in about an hour. Templates, validation, and a checklist.
In 2026, a lot of traffic comes from crawlers and agents, not browsers. Most sites aren't set up for them. This repo fixes that.
Publish a few machine-readable surfaces at known paths. An agent lands, parses, and acts in one shot — no scraping, no guessing.
Templates and a checklist. About an hour to set up.
- `templates/agent.json` — the identity card, with placeholder fields
- `templates/llms.txt` — the narrative, in the structure agents expect
- `templates/robots.txt` — explicit allowlist for the 12 named AI crawlers
- `templates/sitemap.xml` — a sitemap stub
- `templates/jsonld/` — three JSON-LD blocks (ProfessionalService, SoftwareApplication, SoftwareSourceCode)
- `examples/` — Next.js 16 App Router route handlers, ready to paste
- `embed/` — a badge and an HTML snippet adopters can drop in
- `docs/` — pattern overview, product-vs-service guidance, voice rules, validation checklist
Drop four files into `public/` and the site is ~80% there.
```
public/
├── .well-known/
│   └── agent.json     # from templates/agent.json
├── llms.txt           # from templates/llms.txt
├── robots.txt         # from templates/robots.txt
└── sitemap.xml        # from templates/sitemap.xml
```
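For orientation, a filled-in `agent.json` might look like the sketch below. The field names here are illustrative assumptions, not the repo's schema; `templates/agent.json` defines the real placeholder set.

```json
{
  "name": "Example Co",
  "description": "One-line description an agent can quote.",
  "url": "https://example.com",
  "capabilities_url": "/api/agent/capabilities",
  "contact": "mailto:hello@example.com"
}
```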
1. Copy the four files above into the site's `public/` directory.
2. Replace every `{{PLACEHOLDER}}` with a real value. Every one.
3. Add the homepage JSON-LD block from `templates/jsonld/professional-service.json` (or `software-application.json` for a product) inside the page `<head>`.
4. Deploy.
5. Run the checklist in `docs/validation-checklist.md` against the live URL.
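For step 3, the homepage block is a `<script type="application/ld+json">` tag. A trimmed sketch with the template's `{{PLACEHOLDER}}` convention; `templates/jsonld/professional-service.json` carries the full field set:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ProfessionalService",
  "name": "{{SITE_NAME}}",
  "url": "{{SITE_URL}}",
  "description": "{{ONE_LINE_DESCRIPTION}}"
}
</script>
```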
That's it. Optional endpoints (`/api/agent/capabilities`, `/api/agent/simulate/*`) are in `examples/` if the site needs to be callable, not just readable.
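As a shape reference, a capabilities endpoint can be a plain App Router route handler returning a web-standard `Response`. The sketch below is an assumption about the payload, not the repo's actual schema; `examples/nextjs-route-handler/api-agent-capabilities-route.ts` is the source of truth. `Response.json` is available in Node 18+, so this runs outside Next.js too.

```typescript
// Minimal sketch of a GET /api/agent/capabilities handler.
// Next.js App Router route handlers export HTTP-method functions
// that return a Response. The payload shape is illustrative.
export function GET(): Response {
  return Response.json({
    version: "1.0",
    // Hypothetical capability names, for illustration only.
    capabilities: ["read", "simulate"],
    identity: "/.well-known/agent.json",
  });
}
```

An agent that found `capabilities_url` in `agent.json` can call this and branch on the result without scraping anything.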
- reframed.works — canonical product-surface deploy. Verify live:
To get your own deploy added here, point @notaprompt on GitHub at it.
llms.txt is a file. This is a pattern that includes it.
llms.txt gives an LLM a readable summary of the site. Agent-seo layers three additions on top:
- `/.well-known/agent.json` — a structured identity card at a standard path, so an agent can discover capabilities and contact routes without parsing markdown.
- JSON-LD blocks — Schema.org entities in the page `<head>`, so classic search engines and rich-result crawlers see the same identity `llms.txt` describes.
- A named AI-bot allowlist — explicit `robots.txt` rules for the 12 crawlers worth naming, so the site is not accidentally excluded by a wildcard rule copied from a WordPress template.
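A few entries from that allowlist, as a shape reference. GPTBot, ClaudeBot, and PerplexityBot are three of the 12; `templates/robots.txt` names them all:

```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

Sitemap: https://example.com/sitemap.xml
```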
Adopting this pattern does not mean abandoning llms.txt. It means wrapping it in the surrounding surfaces agents already look for.
Sites adopting the pattern can signal it two ways.
In a README:
```markdown
[](https://github.com/notaprompt/agent-seo)
```

In a deployed site's `<head>`:

```html
<link rel="alternate" type="application/agent+json" href="/.well-known/agent.json" title="Agent surface">
<link rel="alternate" type="text/plain" href="/llms.txt" title="LLM-readable site description">
```

Both live in `embed/` with usage notes.
The pattern is machine-readable, but humans still read the copy inside it. Every template here follows the voice rules in docs/voice-rules.md:
- No first-person plural.
- No AI buzzwords.
- Direct. Warm but lean. Every word earns its seat.
A one-line grep that catches most violations:
```
grep -niE 'AI-powered|intelligent|smart|leverage|synergy|seamless|cutting-edge' FILE
```

After deploying, from a fresh terminal:
```
curl -sS https://SITE/.well-known/agent.json | jq .
curl -sS https://SITE/llms.txt | head -40
curl -sS -A "GPTBot/1.0" -o /dev/null -w "%{http_code}\n" https://SITE/
curl -sS -A "ClaudeBot/1.0" -o /dev/null -w "%{http_code}\n" https://SITE/
curl -sS -A "PerplexityBot/1.0" -o /dev/null -w "%{http_code}\n" https://SITE/
```

The full matrix (12 user agents, per-file expectations) is in `docs/validation-checklist.md`.
```
agent-seo/
├── README.md
├── LICENSE
├── CONTRIBUTING.md
├── llms.txt                          # the repo dogfoods the pattern
├── public/.well-known/agent.json     # also dogfooded
├── templates/
│   ├── agent.json
│   ├── llms.txt
│   ├── robots.txt
│   ├── sitemap.xml
│   └── jsonld/
│       ├── professional-service.json
│       ├── software-application.json
│       └── software-source-code.json
├── examples/
│   ├── nextjs-route-handler/
│   │   └── api-agent-capabilities-route.ts
│   └── api-agent-simulate-stub.ts
├── docs/
│   ├── pattern-overview.md
│   ├── product-vs-service.md
│   ├── voice-rules.md
│   └── validation-checklist.md
└── embed/
    ├── badge.md
    ├── footer-snippet.html
    └── README.md
```
Alexander Campos. campos.works
MIT. See LICENSE.