ContextBridge is a context portability tool for AI conversations. This package publishes the MCP server surface: it compresses an in-progress conversation into a structured context block that preserves the user's goal, confirmed decisions, current state, open threads, artifacts, constraints, recent pivots, and suggested next step, then formats that handoff so another AI can continue without forcing the user to restate everything.
The published npm artifact is the stdio MCP server for Claude Desktop, agent runtimes, and other MCP-compatible hosts. The browser extension runtime is developed in the same repository, but it is not part of this MCP Registry publishing flow.
- npm package: @contextbridge_ai/mcp
- MCP Registry search: io.github.prateekg7/context-bridge
- MCP Registry API: latest version JSON
- Source repository: prateekg7/Context-Bridge
Add this to your Claude Desktop MCP config (`~/Library/Application Support/Claude/claude_desktop_config.json`):
```json
{
  "mcpServers": {
    "contextbridge": {
      "command": "npx",
      "args": ["-y", "@contextbridge_ai/mcp"],
      "env": {
        "GROQ_API_KEY": "your_groq_api_key_here"
      }
    }
  }
}
```

Or with OpenAI:
```json
{
  "mcpServers": {
    "contextbridge": {
      "command": "npx",
      "args": ["-y", "@contextbridge_ai/mcp"],
      "env": {
        "OPENAI_API_KEY": "your_openai_api_key_here",
        "OPENAI_MODEL": "gpt-4.1-mini"
      }
    }
  }
}
```

For a hosted no-key lane instead of BYOK:
```json
{
  "mcpServers": {
    "contextbridge": {
      "command": "npx",
      "args": ["-y", "@contextbridge_ai/mcp"],
      "env": {
        "CONTEXTBRIDGE_HOSTED_URL": "https://your-hosted-endpoint.example/v1",
        "CONTEXTBRIDGE_HOSTED_MODEL": "mistralai/Mistral-7B-Instruct-v0.2"
      }
    }
  }
}
```

The server exposes three tools. The first compresses a raw transcript into a structured context block.
Parameters:
- `transcript` (string, required): the raw conversation transcript to compress.
- `provider` ("openai" | "openai-chat" | "google" | "groq" | "custom" | "hosted", optional): provider override.
- `model` (string, optional): model override for the selected lane.
- `apiKey` (string, optional): BYOK API key override.
- `baseUrl` (string, optional): API base URL override.
- `hostedApiKey` (string, optional): optional auth key for the hosted lane.
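As a sketch of how a host might drive the compression tool, the following builds an MCP `tools/call` request for it. The tool name `compress_context` is an assumption (confirm it against the server's `tools/list` output); per the parameters above, only `transcript` is required and the rest are optional overrides:

```python
import json

# Hypothetical tools/call payload; "compress_context" is an assumed tool name,
# not confirmed by the docs above. Only "transcript" is required.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "compress_context",
        "arguments": {
            "transcript": "User: Help me plan the migration...\nAssistant: ...",
            "provider": "groq",                   # optional lane override
            "model": "llama-3.3-70b-versatile",   # optional model override
        },
    },
}

# Serialized form, as it would travel over the stdio transport.
payload = json.dumps(request)
```

A host would write `payload` to the server's stdin (newline-delimited) and read the tool result from stdout, as with any stdio MCP server.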
Formats a compressed context block for a destination platform.
Parameters:
- `context_block` (object, required): a validated ContextBridge compressed context block.
- `target` ("claude" | "chatgpt" | "gemini", required): destination AI platform.
Builds a destination-ready landing package from a compressed context block.
Parameters:
- `context_block` (object, required): a validated ContextBridge compressed context block.
- `target` ("claude" | "chatgpt" | "gemini", required): destination AI platform.
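To illustrate what these two tools consume, here is a hypothetical compressed context block built from the fields the compressor is described as preserving (goal, confirmed decisions, current state, open threads, artifacts, constraints, recent pivots, suggested next step). The exact key names are assumptions, not the package's actual schema:

```python
# Assumed shape for a compressed context block; key names are illustrative.
context_block = {
    "goal": "Migrate the billing service from REST to gRPC",
    "confirmed_decisions": ["Keep protobuf v3", "No breaking API changes"],
    "current_state": "Proto definitions drafted; server stubs pending",
    "open_threads": ["How to version the new endpoints"],
    "artifacts": ["billing.proto"],
    "constraints": ["Must ship behind a feature flag"],
    "recent_pivots": ["Dropped the GraphQL option"],
    "suggested_next_step": "Generate server stubs and wire up health checks",
}

# Both formatting tools pair the block with a destination platform.
format_args = {"context_block": context_block, "target": "chatgpt"}
```

The point of the structure is that the destination AI receives decisions and open threads explicitly, rather than re-deriving them from a pasted transcript.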
ContextBridge auto-selects the OpenAI Responses lane when a BYOK key is present.
Environment variables:
```
OPENAI_API_KEY=your_openai_api_key_here
OPENAI_MODEL=gpt-4.1-mini
OPENAI_BASE_URL=https://api.openai.com/v1
```

Additional supported provider overrides:

```
GOOGLE_API_KEY=your_google_api_key_here
GOOGLE_MODEL=gemini-2.5-pro
GOOGLE_BASE_URL=https://generativelanguage.googleapis.com/v1beta/openai
GROQ_API_KEY=your_groq_api_key_here
GROQ_MODEL=llama-3.3-70b-versatile
GROQ_BASE_URL=https://api.groq.com/openai/v1
CONTEXTBRIDGE_API_KEY=your_custom_api_key_here
CONTEXTBRIDGE_MODEL=your_custom_model_here
CONTEXTBRIDGE_BASE_URL=https://your-endpoint.example/v1
CONTEXTBRIDGE_PROVIDER=custom
```

The hosted lane is the no-key path for end users who do not bring their own provider credentials.
Environment variables:
```
CONTEXTBRIDGE_HOSTED_URL=https://your-hosted-endpoint.example/v1
CONTEXTBRIDGE_HOSTED_MODEL=mistralai/Mistral-7B-Instruct-v0.2
CONTEXTBRIDGE_HOSTED_API_KEY=
```

Selection priority:
1. An explicit `apiKey` option passed into compression uses the OpenAI Responses BYOK lane.
2. `OPENAI_API_KEY` uses the OpenAI Responses BYOK lane.
3. `CONTEXTBRIDGE_HOSTED_URL` uses the hosted lane.
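The priority above can be sketched as a small selection function. This is an illustration of the documented order, not the package's actual code; the lane names are placeholders:

```python
import os

def select_lane(api_key_option=None, env=os.environ):
    """Sketch of the documented lane-selection priority (illustrative only)."""
    if api_key_option:                        # 1. explicit apiKey option wins
        return "openai-byok"
    if env.get("OPENAI_API_KEY"):             # 2. then the OPENAI_API_KEY env var
        return "openai-byok"
    if env.get("CONTEXTBRIDGE_HOSTED_URL"):   # 3. then the hosted lane
        return "hosted"
    return "unconfigured"                     # no lane available
```

Note that an explicit `apiKey` option shadows `CONTEXTBRIDGE_HOSTED_URL` even when both are set, so a BYOK caller is never silently routed through the hosted endpoint.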
Contributions are welcome. The highest-leverage areas are extraction quality evaluation, browser collector resilience, destination formatting quality, hosted-lane infrastructure, and real-world handoff testing across platforms.
MIT. See LICENSE.