Give your AI agent a GPS, not a map dump. Contexly tells agents exactly where to edit, what can break, and why.
Contexly extracts the logic skeleton of your codebase: function signatures, conditions, calls, returns, and impact paths.
Not raw code. The behavior map.
Most agent failures are not syntax bugs. They are navigation bugs. The model edits 2-3 obvious files, misses dependent files, then reports "done".
Contexly solves this by giving an execution-level map first, then targeted context.
What you get:
- Compact, searchable logic trees
- Context lookup by file, function, and behavior intent
- Fast CLI + MCP workflow for day-to-day coding tasks
```
# Agent gets task: "fix pivot hedge logic"
contexly query . "pivot hedge" 2 2
```

Agent gets back:

```
price_monitor.py  [L39-130]  <- signal decided here
round_manager.py  [L110-157] <- amount calculated here
round_manager.py  [L164-443] <- order execution path
Impact: claim_manager.py reads pivot_count too
```
Agent edits exactly those files. Nothing else. No hallucinated edits, no missed dependency hops.
I was building a SaaS product on top of OpenClaw + n8n (roughly 700k lines combined).
As a solo developer using coding agents daily, I kept seeing the same failure: the agent patched 2-3 files and declared success, while real impact lived elsewhere.
Contexly was built to fix that exact failure mode.
Contexly surfaces patterns like duplicate execution functions, legacy overlap, and risky impact chains while building/querying context.
This helps agents avoid copy-paste regressions before they happen.
Ran on a real 13-file Python trading bot:
| Metric | Before | After |
|---|---|---|
| Tokens sent to AI | 197,068 | 7,727 |
| Compression | - | 95.8% |
| AI reads raw code? | Every message | Never |
Same understanding. 25x fewer tokens.
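The 25x figure follows directly from the two token counts in the table; nothing beyond those numbers is assumed here:

```python
# Token counts taken from the table above.
raw_tokens = 197_068   # raw code sent to the AI per message
tree_tokens = 7_727    # compact Contexly logic tree instead

ratio = raw_tokens / tree_tokens
print(f"{ratio:.1f}x fewer tokens")  # -> 25.5x fewer tokens
```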
| Language | Extensions | Parser |
|---|---|---|
| Python | .py | tree-sitter |
| JavaScript | .js, .mjs | tree-sitter |
| TypeScript | .ts, .tsx | tree-sitter |
| Go | .go | tree-sitter |
Files with unsupported extensions are skipped automatically.
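Contexly's actual file-selection code isn't shown here, but the skip rule amounts to a suffix check. A minimal sketch using the extensions from the table (the helper name `is_supported` is hypothetical):

```python
from pathlib import Path

# Extensions from the table above; the shipped list may grow over time.
SUPPORTED_EXTENSIONS = {".py", ".js", ".mjs", ".ts", ".tsx", ".go"}

def is_supported(path: str) -> bool:
    """True if the file would be parsed; False if it would be skipped."""
    return Path(path).suffix in SUPPORTED_EXTENSIONS

print(is_supported("src/server.go"))   # -> True
print(is_supported("docs/README.md"))  # -> False
```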
```
pip install contexly
```

Dev install (with test tools):

```
pip install -e ".[dev]"
```

```
# 1) Build a logic tree
contexly tree .

# 2) Get high-level repository map
contexly index . 0

# 3) Query relevant context
contexly query . "test query" 2 1

# 4) Open generated HTML tree
contexly view .

# Optional: force fresh tree build (ignore cached tree.json)
contexly --rebuild query . "test query" 2 1
```

Initialize `.contexly/` metadata for a project.
Build a logic skeleton tree and save outputs to:
- `.contexly/tree.json`: machine-readable skeleton
- `.contexly/tree.html`: visual browser explorer
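Since `.contexly/tree.json` is presumably plain JSON, it is easy to consume from your own scripts. A minimal sketch (the `load_tree` helper is hypothetical, and the shape of the returned dict depends on the installed Contexly version):

```python
import json
from pathlib import Path

def load_tree(project_root: str) -> dict:
    """Load the machine-readable skeleton written by `contexly tree`."""
    tree_path = Path(project_root) / ".contexly" / "tree.json"
    # The schema is version-dependent; inspect it before relying on keys.
    return json.loads(tree_path.read_text())
```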
Example output:
```
Building logic tree for: my-api/
Files processed: 11
Raw token estimate: 84,310
Tree tokens: 3,920
Compression: 95.4% (21x smaller)

File roles:
  ENTRY  2 file(s)  main.go, server.go
  CORE   5 file(s)
  TEST   4 file(s)
```
Skeleton of one function (raw → compressed):
```
# Raw source (~42 tokens)
func ProcessOrder(ctx context.Context, order Order) error {
    if order.Amount <= 0 {
        return ErrInvalidAmount
    }
    user, err := db.GetUser(ctx, order.UserID)
    if err != nil {
        return err
    }
    return payments.Charge(ctx, user, order.Amount)
}

# Contexly skeleton (~11 tokens)
~ProcessOrder(ctx, order)[L24-36]
  ?if order.Amount <= 0
  >db.GetUser()
  >payments.Charge()
  <err / nil
```
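The skeleton's marker prefixes are regular enough to parse mechanically in downstream tooling. A small sketch; the marker meanings (`~` function, `?` condition, `>` call, `<` return) are inferred from the example above, not from a published spec:

```python
# Marker meanings inferred from the skeleton example; not an official spec.
MARKERS = {"~": "function", "?": "condition", ">": "call", "<": "return"}

def parse_skeleton(text: str) -> list[tuple[str, str]]:
    """Turn skeleton lines into (kind, body) pairs, skipping unmarked lines."""
    nodes = []
    for line in text.splitlines():
        line = line.strip()
        if line and line[0] in MARKERS:
            nodes.append((MARKERS[line[0]], line[1:]))
    return nodes

skeleton = """\
~ProcessOrder(ctx, order)[L24-36]
?if order.Amount <= 0
>db.GetUser()
>payments.Charge()
<err / nil
"""
print(parse_skeleton(skeleton)[0])  # -> ('function', 'ProcessOrder(ctx, order)[L24-36]')
```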
Print compact index from existing tree or create one if missing.
- `level=0`: repo map
- `level=1`: file index (default)
Search context and build targeted result around matched files.
- `depth`: how many dependency hops from matched files (`1` = direct links)
- `level`: output detail (`1` = index view, `2` = function skeletons)
- Add `--rebuild` before the command to ignore the cached tree and force fresh analysis.
Examples:
```
contexly query . "rate limiting" 1 2
contexly query . "auth flow" 2 1 --debug
```

Preview downstream impact before editing a function.
Show current tree summary and compression info.
Open generated tree HTML in the browser.
Session commands for optional progress tracking:
```
session new <path> "Task name"
session update <path> <done|in_progress|todo> "text"
session step <path> "completed" "next"
session status <path>
```
Contexly provides an MCP server for Claude, Copilot, Cursor, Continue, Windsurf, and other MCP-compatible clients.
Start MCP server:
```
contexly-mcp
```

or

```
python contexly_mcp.py
```

See full setup examples in MCP_SETUP.md and mcp.example.json.
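As a rough sketch of client wiring (the authoritative entry lives in mcp.example.json; the snippet below assumes the common `mcpServers` config shape used by Claude Desktop-style clients and is not copied from the repo):

```json
{
  "mcpServers": {
    "contexly": {
      "command": "contexly-mcp"
    }
  }
}
```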
- AGENT_REFERENCE.md: complete agent workflow guide
- MCP_TOOL_REFERENCE.md: quick MCP signatures and examples
- REPO_STRUCTURE.md: repository and module structure
- DOCUMENTATION_INDEX.md: documentation navigation
- MCP_SETUP.md: client integration setup
- React/TSX extraction is much better now, but highly dynamic component patterns can still produce partial skeletons.
- Import connection quality depends on resolver hints (`tsconfig.json` paths, Vite aliases, re-export patterns).
- If output looks stale or unexpectedly thin, use `--rebuild` to bypass the cache and regenerate context.
If you hit a bad case, open an issue with a minimal repro project. That helps improve parser coverage quickly.
- v0.2.0 - VS Code extension (interactive context + impact in editor)
- v0.2.0 - Rust and Java support
- v0.3.0 - Cloud context sync
Please open an issue or pull request with clear reproduction steps and expected behavior.
Distributed under the MIT License. See LICENSE.