Lintai is an experimental AI-aware static-analysis tool that spots LLM-specific security bugs (prompt-injection, insecure output handling, data-leakage …) before code ships.
Why Lintai? | What it does |
---|---|
Traditional SAST can’t “see” how you build prompts, stream completions or store vectors. | Lintai walks your AST, tags every AI sink (OpenAI, Anthropic, LangChain, …), follows wrapper chains, then asks an LLM to judge risk. |
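The "tag every AI sink" step can be pictured as matching call sites by dotted name. A toy sketch, assuming a hand-made sink list — Lintai's real engine also follows wrapper chains and maps whole frameworks:

```python
import ast

# Illustrative only: the AI_SINKS set is a made-up sample, not Lintai's real map.
AI_SINKS = {"client.chat.completions.create", "anthropic.Anthropic"}

def dotted_name(node):
    """Rebuild `a.b.c` from a Name/Attribute chain, or return None."""
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.Attribute):
        base = dotted_name(node.value)
        return f"{base}.{node.attr}" if base else None
    return None

def tag_ai_sinks(source: str):
    """Return (line, dotted-name) for every call whose target is a known sink."""
    return [
        (node.lineno, name)
        for node in ast.walk(ast.parse(source))
        if isinstance(node, ast.Call)
        and (name := dotted_name(node.func)) in AI_SINKS
    ]
```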
Requires Python ≥ 3.10
- Two analysis commands:
  - `lintai ai-inventory <src-code-path>` – list every AI call and its caller chain
  - `lintai scan <src-code-path>` – run all detectors, emit JSON (with `llm_usage` summary)
- LLM budget guard-rails – hard caps on requests / tokens / cost (`LINTAI_MAX_LLM_*`)
- Modular detector registry (`entry_points`)
- OWASP LLM Top-10 & MITRE ATT&CK baked in
- DSL for custom rules
- CI-friendly JSON output (SARIF soon)
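Custom detectors plug in through the `entry_points` registry named above. Registration might look like the following sketch — the group name `lintai.detectors` and the module path are assumptions; check Lintai's own packaging for the real group:

```toml
# Hypothetical plugin registration in a plugin package's pyproject.toml
[project.entry-points."lintai.detectors"]
my_rule = "my_pkg.detectors:MyDetector"
```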
A React/Cytoscape UI is under active development – not shipped in this cut.
```bash
pip install lintai               # core only
pip install "lintai[openai]"     # + OpenAI detectors
# or "lintai[anthropic]" "lintai[gemini]" "lintai[cohere]"
pip install "lintai[ui]"         # FastAPI server extras
```
```bash
# .env (minimal)
LINTAI_LLM_PROVIDER=openai                # azure / anthropic / gemini / cohere / dummy
LLM_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxx   # API key for above provider

# provider-specific knobs
LLM_MODEL_NAME=gpt-4.1-mini
LLM_ENDPOINT_URL=https://api.openai.com/v1/
LLM_API_VERSION=2025-01-01-preview        # required for Azure

# hard budget caps
LINTAI_MAX_LLM_TOKENS=50000
LINTAI_MAX_LLM_COST_USD=10
LINTAI_MAX_LLM_REQUESTS=500
```
Lintai auto-loads `.env`; the UI writes the same file, so CLI & browser stay in sync.
```bash
lintai ai-inventory src/ --ai-call-depth 4
lintai scan src/
lintai ui        # REST docs at http://localhost:8501/api/docs
```
LLM-powered rules collect the full source of functions that call AI frameworks, plus their caller chain, and ask an external LLM to classify OWASP risks.
Budget checks run before the call; actual usage is recorded afterwards.
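The before/after pattern around each LLM call can be sketched as a small guard object. This is a minimal illustration, assuming the `LINTAI_MAX_LLM_*` variables carry the semantics their names suggest — Lintai's internal budget manager may differ:

```python
import os

class LLMBudget:
    """Sketch of hard budget caps mirroring the LINTAI_MAX_LLM_* env vars."""

    def __init__(self):
        self.max_tokens = int(os.getenv("LINTAI_MAX_LLM_TOKENS", "50000"))
        self.max_usd = float(os.getenv("LINTAI_MAX_LLM_COST_USD", "10"))
        self.max_requests = int(os.getenv("LINTAI_MAX_LLM_REQUESTS", "500"))
        self.tokens = self.usd = 0.0
        self.requests = 0

    def check(self):
        # Runs *before* each LLM call; refuses once any cap is exhausted.
        if (self.tokens >= self.max_tokens or self.usd >= self.max_usd
                or self.requests >= self.max_requests):
            raise RuntimeError("LLM budget exhausted")

    def record(self, tokens, usd):
        # Runs *after* the call with the provider-reported actual usage.
        self.tokens += tokens
        self.usd += usd
        self.requests += 1
```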
Flag | Description |
---|---|
`-l DEBUG` | Verbose logging |
`--ruleset <dir>` | Load custom YAML/JSON rules |
`--output <file>` | Write full JSON report instead of stdout |
```json
{
  "llm_usage": {
    "tokens_used": 3544,
    "usd_used": 0.11,
    "requests": 6,
    "limits": { "tokens": 50000, "usd": 10, "requests": 500 }
  },
  "findings": [
    {
      "owasp_id": "LLM01",
      "severity": "blocker",
      "location": "services/chat.py:57",
      "message": "User-tainted f-string used in prompt",
      "fix": "Wrap variable in escape_braces()"
    }
  ]
}
```
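A report with this shape is easy to gate on in CI. A sketch — the report layout matches the sample above, but the policy of failing only on `blocker` severity is this example's choice, not Lintai's:

```python
import json

def gate_on_blockers(report_path: str) -> int:
    """Return 1 (CI failure) when the report contains blocker-severity findings."""
    with open(report_path) as fh:
        report = json.load(fh)
    blockers = [f for f in report.get("findings", [])
                if f.get("severity") == "blocker"]
    for f in blockers:
        print(f'{f["owasp_id"]} {f["location"]}: {f["message"]}')
    return 1 if blockers else 0

# CI usage idea:
#   lintai scan src/ --output report.json
#   python gate.py report.json   (exit with gate_on_blockers("report.json"))
```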
```
lintai/
├── cli.py          Typer entry-point
├── engine/         AST walker & AI-call analysis
├── detectors/      Static & LLM-backed rules
├── dsl/            Custom rule loader
├── llm/            Provider clients & token-budget manager
├── components/     Maps common AI frameworks → canonical types
├── core/           Finding & report model
├── ui/             FastAPI backend (+ React UI coming soon)
└── tests/          Unit / integration tests
examples/           Sample code with insecure AI usage
```
Method & path | Body / Params | Purpose |
---|---|---|
`GET /api/health` | – | Liveness probe |
`GET /api/config` | – | Read current config |
`POST /api/config` | `ConfigModel` JSON | Update settings (path, depth …) |
`GET` / `POST /api/env` | `EnvPayload` JSON | Read / update non-secret `.env` |
`POST /api/secrets` | `SecretPayload` JSON | Store API key (write-only) |
`POST /api/scan` | multipart files | Run detectors on uploaded code |
`POST /api/inventory` | `path=<dir>` | Inventory run on server-side folder |
`GET /api/runs` | – | List all runs + status |
`GET /api/results/{id}` | – | Fetch scan / inventory report |
Auto-generated OpenAPI docs live at `/api/docs`.
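Driving the backend from a script is straightforward. The endpoints below come from the table above; the base URL assumes the default port shown in the quick-start (`lintai ui`), and the helper names are this sketch's own:

```python
import json
from urllib.request import urlopen

BASE = "http://localhost:8501/api"   # default port from `lintai ui`

def endpoint(path: str) -> str:
    """Absolute URL for an API path, e.g. endpoint("runs")."""
    return f"{BASE}/{path.lstrip('/')}"

def get_json(path: str):
    """GET an endpoint and decode JSON (requires a running `lintai ui`)."""
    with urlopen(endpoint(path)) as resp:
        return json.load(resp)

# With the server running:
#   get_json("health")        -> liveness probe
#   get_json("runs")          -> list of runs + status
#   get_json("results/<id>")  -> scan / inventory report
```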
- React JS UI support
- SARIF + GitHub Actions template
- Recognition and categorization of additional AI frameworks
- Lintai VS Code extension
- Live taint-tracking
- Star the repo ⭐
- `git checkout -b feat/my-fix`
- `pytest -q` (all green)
- Open a PR – or a draft PR early
- See `CONTRIBUTING.md`
The UI is a React/TypeScript application. For development:
```bash
# Frontend development
cd lintai/ui/frontend
npm install
npm run dev                       # start dev server

# Build for production (development only)
python scripts/build-frontend.py
```
Note: Built frontend assets are not committed to git. They are built automatically during CI/CD for releases.
Created by Harsh Parandekar (LinkedIn). Licensed under Apache 2.0.