ppcvote/avs-standard

AI Visibility Score (AVS)

The first open standard for measuring website discoverability by AI search engines.

Spec DOI License: CC BY 4.0 Reference Implementation


What is AVS?

AI Visibility Score (AVS) measures how discoverable your website is to both traditional search engines (Google, Bing) and AI-powered search engines (ChatGPT, Perplexity, Gemini, Copilot).

AVS = SEO Score × 0.5 + AEO Score × 0.5

One number. 0-100. Grade A-F. Tells you: "Can AI find you?"

Why it matters

In 2026, a growing share of web traffic comes from AI search interfaces. Users ask ChatGPT "recommend a good restaurant" instead of Googling. If your website isn't optimized for AI retrieval, you're invisible to this new channel.

No standardized, free, open metric existed to measure this. Until now.

How it works

Component | What it measures | Checks | Cost
SEO Score | Traditional search engine optimization | 30+ checks, 8 categories | $0
AEO Score | AI search engine optimization | 32+ checks, 8 categories | $0
AVS | Combined AI visibility | SEO + AEO weighted average | $0

The entire analysis is deterministic (pure HTML parsing, no LLM calls), completes in < 50ms, and costs $0.
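As an illustration of what "deterministic, pure HTML parsing" means in practice, here is a hypothetical check in the style of the scanners (this specific check and its shape are assumptions for illustration, not taken from the reference implementation):

```javascript
// Hypothetical AEO-style check: does the page have a non-empty <title>?
// Pure string matching on the fetched HTML — no network round-trips and
// no LLM calls, so the result is deterministic and runs in microseconds.
function checkTitleTag(html) {
  const match = html.match(/<title[^>]*>([^<]*)<\/title>/i);
  const text = match ? match[1].trim() : '';
  return { id: 'title-tag', pass: text.length > 0, value: text };
}
```

Because every check is a pure function of the HTML string, the same input always yields the same score, which is what keeps the whole scan under 50ms and at $0.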

Quick Start

Scan any website (free)

Web UI: ultralab.tw/probe

Use the reference implementation

npm install @ultralab/scanners

import { runSeoScan, runAeoScan } from '@ultralab/scanners'

// Fetch the raw HTML, run both scanners on it, then combine the
// sub-scores with the AVS formula (equal weights, rounded)
const html = await fetch('https://example.com').then(r => r.text())
const seo = runSeoScan(html, 'https://example.com')
const aeo = runAeoScan(html, 'https://example.com')
const avs = Math.round(seo.score * 0.5 + aeo.score * 0.5)

console.log(`AVS: ${avs}/100`) // e.g. AVS: 47/100

Grade Scale

Grade | Score | Meaning
A | 90-100 | Highly visible to Google AND AI
B | 75-89 | Good SEO, some AEO gaps
C | 60-74 | Fair. Competitors may be preferred by AI
D | 45-59 | Poor. Largely invisible to AI search
F | 0-44 | Invisible. Immediate action required
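The grade bands above can be expressed as a small mapping function (a minimal sketch; the function name is illustrative, and any score below the D band maps to F):

```javascript
// Map an AVS score (0-100) to its letter grade, matching the
// grade scale table: A >= 90, B >= 75, C >= 60, D >= 45, else F.
function gradeFor(score) {
  if (score >= 90) return 'A';
  if (score >= 75) return 'B';
  if (score >= 60) return 'C';
  if (score >= 45) return 'D';
  return 'F';
}
```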

Specification

Full specification: AVS v1.0

Covers:

  • Scoring formula and grade mapping
  • SEO sub-score: 8 categories, 30+ checks
  • AEO sub-score: 8 categories, 32+ checks
  • Measurement protocol
  • Validation study methodology

Validation

We are conducting an empirical validation study:

  • 250 queries submitted to AI search engines (OpenAI web_search)
  • Cited URLs scanned with AVS reference implementation
  • Correlation analysis between AVS scores and AI citation behavior
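The correlation step can be sketched as a plain Pearson coefficient between per-URL AVS scores and citation counts (an illustrative sketch only; the pairing of scores to citation counts is an assumption here, and the study's actual analysis may differ):

```javascript
// Pearson correlation coefficient between two equal-length series,
// e.g. xs = AVS scores per URL, ys = citation counts per URL.
function pearson(xs, ys) {
  const n = xs.length;
  const mean = (a) => a.reduce((s, v) => s + v, 0) / n;
  const mx = mean(xs), my = mean(ys);
  let num = 0, dx2 = 0, dy2 = 0;
  for (let i = 0; i < n; i++) {
    const dx = xs[i] - mx, dy = ys[i] - my;
    num += dx * dy;
    dx2 += dx * dx;
    dy2 += dy * dy;
  }
  return num / Math.sqrt(dx2 * dy2);
}
```

A coefficient near +1 would mean higher-AVS sites are cited more often; near 0, no linear relationship.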

Key findings (155 queries, 816 citations, 721 AVS scores):

  • Median AVS of cited websites: 77 (Grade B)
  • 59.8% of cited sites scored B or above
  • Recommendation queries cite highest-AVS sites (mean 80.2)
  • Local queries cite lowest-AVS sites (mean 60.0)
  • SEO-AEO gap: mean SEO 80.6 vs mean AEO 64.5

Paper: DOI 10.5281/zenodo.19410475

Comparison with Existing Standards

Standard | What it measures | Scope | Open | Free
Lighthouse | Web performance + SEO basics | Google-centric | |
Core Web Vitals | Page experience metrics | Google-centric | |
CVSS | Vulnerability severity | Security | |
OWASP Top 10 | Security risk categories | Security | |
ATR (PanGuard) | AI agent threat rules | Agent security | |
AVS | AI search visibility | SEO + AI search | |

AVS fills the gap between traditional SEO metrics and the emerging AI search landscape.

Contributing

AVS is designed to be community-owned. We welcome:

  • New AEO checks: Propose checks that improve AI citation prediction
  • Validation studies: Run independent studies with different AI engines
  • Language support: Test AVS with non-English queries
  • Weight calibration: Help us optimize the scoring weights with data

See CONTRIBUTING.md for guidelines.

License

The AVS specification is released under CC BY 4.0.

AVS is an open standard initiated by Ultra Lab. It is not affiliated with Google, OpenAI, Anthropic, or any other AI company.
