Context-efficient API references for LLM toolchains — structured, on-demand, not always-on.
libcontext inspects any installed Python package via static AST analysis (no code execution) and generates compact Markdown API references. It integrates with Claude Code and GitHub Copilot (via a `/lib` skill) and with VS Code / Cursor (via an MCP server) to provide progressive disclosure: API context is loaded only when you actually need it, instead of polluting the context window on every interaction.
LLMs can often use popular libraries correctly from training data alone. For well-known packages like requests or flask, libcontext adds little value. The real problems arise in specific scenarios:
- Internal / private libraries — Zero training data exists. The model has never seen the API.
- Niche open-source packages — Sparse or outdated training data leads to hallucinated methods and wrong signatures. GPT-4o achieves only 38% valid invocations on low-frequency APIs (Amazon Science, ICSE 2025).
- New versions of any library — Training data has a cutoff. The model knows v2, you're using v3.
Even when an LLM could read source files directly, structured API summaries are more context-efficient: on niche libraries (Polars, Ibis, GeoPandas, Ivy), providing API documentation via retrieval improves pass rates by 83–220% compared to no documentation, while consuming far fewer tokens than raw source code (arXiv 2503.15231, March 2025).
Dumping entire API references into always-on instruction files wastes context window on every interaction. Selective retrieval outperforms always-on injection — indiscriminate documentation retrieval causes up to a 39% absolute performance drop on well-known APIs where the model already has strong training data (Amazon Science, ICSE 2025).
libcontext addresses this with progressive disclosure: overview first, then drill into specific modules only when needed.
| Scenario | Impact | Why |
|---|---|---|
| Internal / private libraries | Critical | Zero training data — the model has never seen the API |
| Niche open-source packages | High | Sparse training data; across 16 LLMs, 19.7% of generated package imports reference hallucinated packages — 5.2% for commercial models, 21.7% for open-source (USENIX Security 2025) |
| New versions of any library | High | Training cutoff — the LLM knows v2, you're using v3 |
| Popular, stable libraries | Low | The LLM already has good knowledge from training data — libcontext adds little here |
libcontext does not try to:
- Replace reading source code — LLMs with tool access (Claude Code, Cursor) can read files directly. For popular libraries, that's often sufficient.
- Guarantee correctness — even with perfect API docs, LLMs still make errors. Research shows pass rates of 48–92% with target documentation, depending on the model — not 100% (arXiv 2503.15231).
- Provide usage examples — libcontext extracts signatures and docstrings, not example code. Research shows example code has the highest impact on generation quality; removing examples causes the largest performance drops (arXiv 2503.15231).
```bash
# Install globally with uv (recommended — available in all projects)
uv tool install libcontext

# Install the /lib skill into your Claude Code project
libctx install --skills

# Now in Claude Code, just type:
#   /lib requests
# Claude will progressively discover the API for you
```

For VS Code with MCP support:

```bash
uv tool install "libcontext[mcp]"
libctx install --mcp --target vscode
```

Instead of dumping everything upfront, libcontext follows a progressive workflow:
```
Step 1: Overview             Step 2: Drill down                Step 3: Search

libctx inspect requests      libctx inspect requests           libctx inspect requests
    --overview                   --module requests.api             --search Session

Module list with             Full signatures, docstrings,      Find specific classes
class/function names         and parameters for one module     or methods across
(no signatures)                                                all modules
```
The `/lib` skill (Claude Code, GitHub Copilot) and the MCP server (Claude Code, VS Code, Cursor) automate this workflow — the AI assistant decides what to inspect based on the task at hand.
```bash
# Full API reference to stdout
libctx inspect requests

# Compact overview — module names with class/function names
libctx inspect requests --overview -q

# Detailed API for a single module
libctx inspect requests --module requests.api -q

# Search for a specific class or function
libctx inspect requests --search Session -q

# Write to a file with marker injection
libctx inspect requests -o .github/copilot-instructions.md

# Multiple libraries at once
libctx inspect requests httpx pydantic -o context.md

# JSON output (programmatic consumption)
libctx inspect requests --format json
libctx inspect requests --search Session --format json

# Filter search by type
libctx inspect requests --search Session --type class -q

# Compare two API snapshots
libctx inspect requests --format json > old.json
# ... upgrade requests ...
libctx inspect requests --format json > new.json
libctx diff old.json new.json

# Bypass disk cache
libctx inspect requests --no-cache

# Cache management
libctx cache list                # show cached packages with size and age
libctx cache clear               # clear all cached API data
libctx cache clear requests      # clear only the entries for one package
```

Under the hood, libcontext works in five stages:

- Parsing — Reads `.py` and `.pyi` source files using Python's `ast` module. No code is ever executed.
- Stub merging — Discovers colocated and standalone stub packages; merges signatures from stubs with docstrings from sources.
- Extraction — Classes, functions, methods, parameters, type annotations, decorators, type aliases, and docstrings.
- Compact rendering — Structured Markdown (or JSON) optimised for LLM context windows.
- Disk cache — Results are cached on disk and revalidated via `(version, mtime, file_count)` to avoid re-parsing unchanged packages.
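The parsing stage can be illustrated in miniature. This is not libcontext's actual extractor — just a sketch of what `ast`-based signature extraction looks like, using a made-up sample function:

```python
import ast

# A made-up snippet standing in for a library source file
source = '''
def get(url: str, timeout: float = 5.0) -> "Response":
    """Send a GET request."""
'''

func = ast.parse(source).body[0]  # the FunctionDef node

# Rebuild the signature from annotations and defaults — no code execution
args = func.args.args
defaults = [None] * (len(args) - len(func.args.defaults)) + list(func.args.defaults)
params = []
for arg, default in zip(args, defaults):
    text = arg.arg
    if arg.annotation is not None:
        text += f": {ast.unparse(arg.annotation)}"
    if default is not None:
        text += f" = {ast.unparse(default)}"
    params.append(text)

returns = f" -> {ast.unparse(func.returns)}" if func.returns else ""
sig = f"def {func.name}({', '.join(params)}){returns}"

print(sig)                      # def get(url: str, timeout: float = 5.0) -> 'Response'
print(ast.get_docstring(func))  # Send a GET request.
```

Because only the syntax tree is inspected, this works on packages whose imports would fail or have side effects — the same property that lets libcontext inspect any installed package safely.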
Install once, use across all projects — no per-project dependency needed:
```bash
uv tool install libcontext             # CLI only
uv tool install "libcontext[mcp]"      # with MCP server (requires Python 3.10+)
```

Update later with `uv tool upgrade libcontext`.

One-off usage without installing:

```bash
uvx --from libcontext libctx inspect requests
```

If you prefer to add libcontext as a project dependency:

```bash
uv add libcontext            # basic
uv add "libcontext[mcp]"     # with MCP server
```

Or with pip:

```bash
pip install libcontext
pip install "libcontext[mcp]"
```

To work on libcontext itself, clone the repository:

```bash
git clone https://github.com/Syclaw/libcontext.git
cd libcontext
uv sync --all-extras
```

The install command configures your project for AI-assisted library discovery:
```bash
# Claude Code — install the /lib skill
libctx install --skills

# Claude Code — install MCP server config
libctx install --mcp

# VS Code / Cursor — install MCP server config
libctx install --mcp --target vscode

# GitHub Copilot — install the skill
libctx install --skills --target github

# Everything at once
libctx install --all --target all
```

| Flag | What it installs |
|---|---|
| `--skills` | `/lib` skill for on-demand API discovery |
| `--mcp` | MCP server configuration for tool-based access |
| `--all` | Both skills and MCP |

| Target | Skills location | MCP location |
|---|---|---|
| `claude` (default) | `.claude/skills/lib/SKILL.md` | `.mcp.json` |
| `github` | `.github/skills/lib/SKILL.md` | — |
| `vscode` | — | `.vscode/mcp.json` |
After `libctx install --skills`, type `/lib <package>` in Claude Code:

```
/lib requests                  → overview, then drill into modules
/lib requests requests.api     → jump straight to a specific module
```

Claude will automatically run `libctx` commands to discover the API progressively.
After `libctx install --mcp`, the MCP server provides these tools:

- `get_package_overview` — structural overview of a package
- `get_module_api` — detailed API for a single module
- `search_api` — search by name or docstring (with optional `kind` filter and `format` for JSON output)
- `get_api_json` — full package or single-module API as structured JSON
- `diff_api` — compare two API snapshots and report changes with breaking change detection
- `refresh_cache` — clear both in-memory and disk caches
```python
from libcontext import collect_package, render_package

# Full API reference
pkg = collect_package("requests")
print(render_package(pkg))
```

```python
from libcontext import collect_package, render_package_overview, render_module, search_package

pkg = collect_package("requests")

# Overview — module names with class/function names
print(render_package_overview(pkg))

# Single module — full signatures and docstrings
for mod in pkg.non_empty_modules:
    if mod.name == "requests.api":
        print(render_module(mod))

# Search — find specific classes or functions
print(search_package(pkg, "Session"))

# Search with type filter
print(search_package(pkg, "Session", kind="class"))
```

```python
import dataclasses
import json

from libcontext import collect_package, diff_packages, render_diff
from libcontext.models import PackageInfo, _serialize_envelope

# JSON serialization (roundtrip-safe)
pkg = collect_package("requests")
data = _serialize_envelope(dataclasses.asdict(pkg))
print(json.dumps(data, indent=2))

# Reconstruct from JSON
pkg_restored = PackageInfo.from_dict(data["data"])

# API diff between versions (old_data / new_data are snapshots
# produced by the serialization step above)
old_pkg = PackageInfo.from_dict(old_data)
new_pkg = PackageInfo.from_dict(new_data)
result = diff_packages(old_pkg, new_pkg)
print(render_diff(result))
```

Library authors can customise what libcontext exposes by adding a `[tool.libcontext]` section to their `pyproject.toml`. The library does not need to depend on libcontext.
```toml
[tool.libcontext]
include_modules = ["mylib.core", "mylib.models"]
exclude_modules = ["mylib._internal", "mylib.tests"]
include_private = false
max_readme_lines = 150
extra_context = """
This library uses the Repository pattern for data access.
All async operations use httpx internally.
"""
```

| Module | Role |
|---|---|
| `models.py` | Dataclasses for packages, modules, classes, functions, and diff results |
| `inspector.py` | Static AST analysis — signatures, docstrings, decorators, type aliases |
| `collector.py` | Package discovery, module collection, stub merging, and disk cache integration |
| `config.py` | Reads `[tool.libcontext]` from `pyproject.toml` |
| `renderer.py` | LLM-optimised Markdown generation (full, overview, module, search, diff) |
| `diff.py` | API diff between two package versions with breaking change detection |
| `cache.py` | Persistent disk cache with mtime/file-count invalidation and LRU eviction |
| `cli.py` | CLI entry point — inspect, install, diff, and cache subcommands |
| `mcp_server.py` | MCP server for Claude Code / VS Code / Cursor integration (optional) |
| `_security.py` | Input sanitisation, path boundary validation, output size guards |
```bash
uv sync --all-extras
uv run pytest --cov=libcontext
uv run ruff check src/ tests/
uv run ruff format src/ tests/
uv run mypy src/libcontext
```

See CONTRIBUTING.md for detailed contribution guidelines.
See DEPENDENCIES.md for the full list of dependencies and their licenses.
MIT — see LICENSE for details.