The Universal Standard for Context Engineering.
context-pipe is a high-performance orchestration layer designed to bring the Unix Philosophy to the AI context window. It allows you to connect AI tools (Spokes) into a series of Streams, ensuring that data is refined, distilled, and noise-free before it ever reaches the LLM.
In the "Studio of Two" philosophy, we build Systems, not Patches. context-pipe is the system that manages the flow of context, allowing you to chain specialized tools (Refineries) like semantic-sift into your agentic workflows with zero token overhead and millisecond latency.
A language-agnostic standard based on stdin and stdout. If a tool can read text and emit text, it can be a node in the pipe.
A lightweight orchestrator that manages multi-node data streams (e.g., [Ingest] -> [Mask] -> [Rerank] -> [Distill]).
Universal hooks that automatically apply your context pipes to any MCP tool call in IDEs like Cursor, VS Code, and Windsurf. For OpenCode, the AGENTS.md SOP mandate (pipe_read_file for all file reads) is the active strategy until transparent plugin interception is supported upstream.
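The protocol and orchestrator ideas above can be made concrete in a few lines. The sketch below is illustrative only, not Context-Pipe's implementation: the node names (`mask`, `distill`), the secret token, and `run_stream` are invented for this example.

```python
# A protocol-conformant node is any program that reads text on stdin and
# emits text on stdout; a real node would wrap a function like these in
# `sys.stdout.write(refine(sys.stdin.read()))`.

def mask(text: str) -> str:
    """Toy [Mask] node: redact a hard-coded secret token."""
    return text.replace("sk-12345", "[REDACTED]")

def distill(text: str) -> str:
    """Toy [Distill] node: drop blank lines."""
    return "\n".join(line for line in text.splitlines() if line.strip())

def run_stream(text: str, nodes) -> str:
    """The orchestrator in miniature: pipe each node's output into the next."""
    for node in nodes:
        text = node(text)
    return text

print(run_stream("key=sk-12345\n\ndone", [mask, distill]))
```

Because every node speaks plain text, the orchestrator never needs to know what a node does internally, or what language it is written in.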
Option A: Quick Install (PyPI)
Because MCP servers require an explicit Python executable path in your IDE config, you must create a virtual environment first:
ℹ️ What you get: This installs the Context-Pipe orchestration layer and Semantic-Sift's core Python server. The `sift-core` Rust binary (for near-instant heuristic sifting) is included in the PyPI wheel; no Rust toolchain required. The `[neural]` extra adds PyTorch (~1.5 GB) for large-payload semantic compression.
```bash
uv venv
# Windows: .\.venv\Scripts\activate
# macOS/Linux: source .venv/bin/activate
uv pip install mcp-context-pipe "semantic-sift[neural,multi-modal]"
```

Option B: Sovereign Pattern (Recommended for Studio of Two)
Clone both repos side-by-side. The context-pipe venv acts as the master environment holding both packages. See Section 0 of the Operator's Guide for the full sequence.
```bash
# 1. Clone both repos
git clone https://github.com/luismichio/context-pipe.git
git clone https://github.com/luismichio/semantic-sift.git

# 2. Master venv in context-pipe - holds both packages
cd context-pipe
python3.12 -m venv venv
# Windows:
.\venv\Scripts\activate
# macOS/Linux:
# source venv/bin/activate
uv pip install -e .
uv pip install -e ../semantic-sift  # semantic-sift-cli lands in context-pipe/venv/Scripts/ (Win) or venv/bin/ (Mac/Linux)

# 3. ML runtime venv in semantic-sift (Python 3.12 for torch/CUDA compatibility)
cd ../semantic-sift
python3.12 -m venv venv312
# Windows:
.\venv312\Scripts\activate
# macOS/Linux:
# source venv312/bin/activate
uv pip install -e .[neural]  # torch, transformers, llmlingua
```

Note: The package name on PyPI is `mcp-context-pipe`, but the installed module is `context_pipe`. The `semantic-sift-cli` binary is registered only in the venv where `semantic-sift` is pip-installed (step 2 above). Both `pipes.json` files must reference that absolute path.
CRITICAL: For exact configuration paths for Cursor, Gemini, OpenCode, VS Code, and Claude, reference the Master Configuration Matrix.
Context-Pipe is the "Switchboard," but it needs a "Refinery" to distill data. Semantic-Sift is the flagship intelligence engine for this ecosystem. It uses heuristic sieves and neural models (BERT/ONNX) to incinerate noise (timestamps, boilerplate) while preserving 95% of the signal.
Note: In the Sovereign Pattern, `semantic-sift` is cross-installed into `context-pipe/venv` (step 2 above). Context-Pipe will also auto-discover a separately installed `semantic-sift-cli` across all known locations (system PATH, pipx, sibling venv directories) via `pipe_onboard` or `pipe_verify`.
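To make the idea of a heuristic sieve concrete, here is a toy sketch in Python. The timestamp pattern and the `sieve` function are illustrative inventions for this README, not Semantic-Sift's actual implementation:

```python
import re

# Toy heuristic sieve: strip leading ISO-8601-style timestamps from each
# log line, then drop lines left empty. Real sieves are more sophisticated.
TIMESTAMP = re.compile(r"^\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}\S*\s*")

def sieve(text: str) -> str:
    stripped = (TIMESTAMP.sub("", line) for line in text.splitlines())
    return "\n".join(line for line in stripped if line.strip())

log = "2024-05-01 12:00:01 ERROR db timeout\n2024-05-01 12:00:02 retrying\n"
print(sieve(log))
```

The point of the sketch: noise like timestamps is structural, so cheap regex heuristics can remove it deterministically, reserving the neural models for compression that actually requires semantic judgment.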
After installing both packages, ask your AI assistant to verify the full stack:
"Run `pipe_verify()` to confirm the installation."
This will report the health of every component and automatically link `semantic-sift-cli` into `pipes.json` if it was found in a separate environment.
Edit `pipes.json` (see `pipes.json.example`) to define your high-fidelity context streams.
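As a purely illustrative sketch (the authoritative schema lives in `pipes.json.example`, and the top-level mapping of stream name to node list is an assumption), a named stream could chain the two node shapes this README describes: a shell command and a binary tool. The `grep` filter and the flag-less `semantic-sift-cli` invocation are placeholders:

```json
{
  "standard-distill": [
    { "cmd": "grep -v 'DEBUG'", "shell": true },
    { "cmd": "semantic-sift-cli" }
  ]
}
```

Each node receives the previous node's stdout on its stdin, so ordering in the list is the ordering of the refinery.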
Once connected, ask your AI Assistant to configure your workspace:
"Run `pipe_onboard(environment='Cursor')` to configure this project."
Detailed documentation is available in the doc/ directory.
- doc/INDEX.md: The navigational roadmap for the documentation ecosystem.
- doc/USE_CASES.md: Real-world, high-impact scenarios demonstrating how to chain Bash, Skills, and Semantic-Sift.
- doc/OPERATOR_GUIDE.md: Definitive guide for setup, terminal mastery, and `pipes.json` configuration.
- doc/ARCHITECTURE.md: Technical specifications of the orchestration spine and switchboard.
- doc/CONTEXT_PIPE_PROTOCOL.md: The language-agnostic standard for tool interoperability.
- doc/INTEGRATION_ENCYCLOPEDIA.md: Master Compatibility Matrix for Cursor, VS Code, Gemini, and Claude.
Context-Pipe follows the Unix Philosophy. You can use it as a standalone utility or inside existing bash chains.
```bash
# Sift a log file through the 'standard-distill' pipe
cat app.log | context-pipe run standard-distill

# Process a document through a multi-node refinery
cat spec.pdf | context-pipe run full-refinery > distilled_spec.md

# Pre-distill code for manual copy-pasting
cat server.py | context-pipe run semantic-refinery | clip
```

Context-Pipe supports more than just simple binaries. You can chain standard OS tools and expert mandates.
Execute arbitrary shell commands as part of your pipe.
```json
{ "cmd": "grep 'ERROR'", "shell": true }
```

Apply an "Expert Lens" to the context by injecting specialized skill mandates.

```json
{ "cmd": "context-pipe-skill", "args": ["security-auditor"] }
```

Context-Pipe is a foundational member of the Studio of Two infrastructure. It is designed to work in high-fidelity harmony with:
- Semantic-Sift: The intelligent refinery for agentic context. Sift is the flagship distillation engine for Context-Pipe, providing the mathematical and neural sifting nodes used in our standard templates.
context-pipe is licensed under the Apache License 2.0. It is an "Open Source, Closed Contribution" project maintained by the Studio of Two to ensure architectural integrity.
Building High-Fidelity Infrastructure for the Intelligence Age.