GEO Audit Report — github/gh-aw

Audit Date: 2026-05-11
Run: #25685332477

📊 Scores

Docs site (github.github.com/gh-aw/): 38/100 — Foundation
README (github.com/github/gh-aw): 55/100 — Foundation
Both targets are in the Foundation band, meaning the basics are in place but significant gaps exist in AI-specific discoverability signals. Core content quality is strong (84/100 citability on both), but infrastructure for AI crawlers is incomplete.
✅ Top Strengths
Docs site (github.github.com/gh-aw/)

🏆 Perfect meta tags (14/14) — title, description, canonical, Open Graph all present
🏆 Perfect signals (6/6) — language declared, RSS feed linked, freshness date present (2026-05-09)
🏆 Schema types detected — WebSite, Organization, SoftwareApplication, and FAQPage

README (github.com/github/gh-aw)

🔒 Full technical trust (5/5) — HTTPS, HSTS, CSP, and X-Frame-Options all present
🚨 Critical Gaps
Docs site has no `robots.txt` (0/18) — AI crawlers (GPTBot, ClaudeBot, PerplexityBot) are not explicitly allowed on github.github.com/gh-aw/. Without this, crawlers may be uncertain about access.
Docs site has no `llms.txt` (0/18) — The docs site is missing the LLM-friendly summary file that the README already has. This is a direct gap in structured AI discoverability.
AI Discovery endpoints missing on both targets (0/6 each) — Neither site has:
`/.well-known/ai.txt`
`/ai/summary.json`
`/ai/faq.json`
`/ai/service.json`
README has no Schema JSON-LD (0/16) — The GitHub.com repository page lacks structured data (WebSite, Organization, FAQPage). This is a GitHub platform limitation, but worth noting as it severely limits Google AI Overview and knowledge graph inclusion.
Brand & entity signals weak on both (3/10 each) — No Wikipedia, Wikidata, LinkedIn, or Crunchbase `sameAs` links in schema; no Knowledge Graph pillar URLs.
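The `robots.txt` gap is the cheapest of these to close. A minimal sketch that explicitly admits the AI crawlers named above (the user-agent tokens are the publicly documented ones for OpenAI, Anthropic, and Perplexity; the Sitemap line assumes the `/sitemap.xml` 404 noted later in this report is also resolved):

```
# Explicitly allow the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Default: allow everything else
User-agent: *
Allow: /

Sitemap: https://github.github.com/gh-aw/sitemap.xml
```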
🔧 Recommended Fixes
Ordered by impact vs. effort:
| Priority | Fix | Target | Impact |
| --- | --- | --- | --- |
| 🔴 P1 | Create `robots.txt` on docs site allowing AI bots | Docs site | Robots: 0 → 18 pts |
| 🔴 P1 | Create `/llms.txt` on docs site (`geo llms --base-url https://github.github.com/gh-aw`) | Docs site | llms.txt: 0 → ~14 pts |
| 🟠 P2 | Create `/.well-known/ai.txt` on docs site | Docs site | AI discovery: +2 pts |
| 🟠 P2 | Create `/ai/summary.json`, `/ai/faq.json`, `/ai/service.json` on docs site | Docs site | AI discovery: 0 → 6 pts |
| 🟠 P2 | Fix docs site sitemap (`/sitemap.xml` returns 404) to enable sitemap auditing | Docs site | Sitemap coverage |
| 🟡 P3 | Add `sameAs` links (Wikipedia/Wikidata/LinkedIn) to Organization schema on docs site | Docs site | Brand/entity: +3 pts |
| 🟡 P3 | Fix docs site Schema JSON-LD: add `@context` to WebSite, Organization, and FAQPage schemas | Docs site | Schema: 7 → 12+ pts |
| 🟡 P3 | Add `dateModified` schema to README page | README | Signals: +1-2 pts |
| 🟡 P3 | Add canonical `<link rel="canonical">` to README page | README | Meta: +1 pt |
| 🔵 P4 | Reduce keyword density of "github" (3.7%) on docs site — diversify vocabulary | Docs site | Negative penalty |
| 🔵 P4 | Add WebApp / SoftwareApplication schema with `potentialAction` to docs site | Docs site | WebMCP readiness |
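For the P1 `llms.txt` row, the `geo llms` command shown generates the file. If written by hand instead, the llms.txt convention is an H1 title, a one-line blockquote summary, and sections of markdown links. A sketch with placeholder content (the summary text and page URLs below are illustrative, not taken from the audit):

```markdown
# gh-aw

> One-sentence project summary, written for LLM consumption.

## Documentation

- [Getting started](https://github.github.com/gh-aw/getting-started/): placeholder description
- [Reference](https://github.github.com/gh-aw/reference/): placeholder description
```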
📋 Full Breakdown by Category — Docs Site (38/100)
| Category | Score | Max | Status |
| --- | --- | --- | --- |
| robots.txt | 0 | 18 | ❌ File not found |
| llms.txt | 0 | 18 | ❌ File not found |
| Schema JSON-LD | 7 | 16 | ⚠️ Missing `@context` fields; no WebApp schema |
| Meta Tags | 14 | 14 | ✅ Perfect |
| Content | 11 | 12 | ✅ Missing numerical statistics |
| Signals | 6 | 6 | ✅ Perfect |
| AI Discovery | 0 | 6 | ❌ No well-known, summary, faq, or service endpoints |
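The Schema gap is a mechanical fix: each emitted JSON-LD object needs a top-level `@context`. A minimal corrected WebSite block as a sketch (the `name` value is illustrative):

```json
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "name": "gh-aw documentation",
  "url": "https://github.github.com/gh-aw/"
}
```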
Trust Stack: Grade C (medium) — Technical: 3/5, Identity: 2/5, Social: 3/5, Academic: 1/5, Consistency: 4/5
Platform Citation Scores: ChatGPT 50/100 · Perplexity 65/100 · Google AI 70/100
RAG Readiness: Chunk score 45/100 — avg section 5.9 words (too short); 0/14 sections in optimal range
Prompt Injection: Medium risk — 1 hidden text pattern detected ("CtrlK" from keyboard shortcut UI)

📋 Full Breakdown by Category — README (55/100)

llms.txt present but missing `llms-full.txt` and optional sections
Trust Stack: Grade C (medium) — Technical: 5/5, Identity: 3/5, Social: 1/5, Academic: 4/5, Consistency: 3/5
Platform Citation Scores: ChatGPT 55/100 · Perplexity 70/100 · Google AI 45/100
Content Decay Risk: HIGH — evergreen score 30/100; 10 decay signals detected (version numbers, closed issue references, temporal language)
RAG Readiness: Chunk score 46/100 — avg section 23.4 words; heavy GitHub nav boilerplate dilutes content
Prompt Injection: Medium risk — hidden GitHub navigation text detected in `display:none` elements
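The chunk-score signal above is driven by average words per section. A rough way to reproduce that signal locally, as a sketch (the audit tool's real chunker is not public, so this splits on markdown headings as a simplification):

```python
import re

def avg_section_words(markdown: str) -> float:
    """Rough proxy for the audit's chunk signal: average word count
    of the text between consecutive markdown headings."""
    sections = re.split(r"^#{1,6}\s.*$", markdown, flags=re.MULTILINE)
    counts = [len(s.split()) for s in sections if s.strip()]
    return sum(counts) / len(counts) if counts else 0.0

doc = "# A\n\nshort line\n\n## B\n\nthis section has five words\n"
print(avg_section_words(doc))  # 3.5 (sections of 2 and 5 words)
```

Sections averaging only a few words each, as flagged here, score poorly because retrieval chunks end up carrying mostly navigation text rather than content.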
📄 Sitemap Audit
https://github.github.com/gh-aw/sitemap.xml returned HTTP 404. No sitemap pages could be audited.
Recommended fix: Configure the Astro Starlight site to generate and serve a sitemap. Starlight supports this via the @astrojs/sitemap integration or Starlight's built-in sitemap support.
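Assuming the docs site is an Astro Starlight project as the audit suggests, this is a config-level change: Starlight emits a sitemap automatically once the `site` option is set, and plain Astro sites can use the `@astrojs/sitemap` integration instead. A sketch (the `title`, `site`, and `base` values are assumptions about the project's existing config):

```js
// astro.config.mjs (sketch; merge into the existing config)
import { defineConfig } from 'astro/config';
import starlight from '@astrojs/starlight';

export default defineConfig({
  // Setting `site` enables Starlight's built-in sitemap generation.
  site: 'https://github.github.com',
  base: '/gh-aw',
  integrations: [starlight({ title: 'gh-aw' })],
});
```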
Automated audit powered by geo-optimizer-skill · Run logs