An MCP (Model Context Protocol) server that provides seamless integration with OpenCode, the open-source AI coding agent for the terminal.
- Execute OpenCode Commands: Run any OpenCode CLI command programmatically
- Session Management: Create, continue, and export coding sessions
- Model Discovery: List available AI models from all configured providers
- Async Execution: Non-blocking command execution with timeout handling
- JSON Lines Parsing: Robust parsing of OpenCode's streaming output format
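To illustrate the JSON Lines feature: OpenCode emits one JSON object per line of output. A minimal parsing sketch, assuming only that each line is standalone JSON (the `parse_json_lines` helper and the tolerant skip-on-error behavior are illustrative, not the server's actual implementation):

```python
import json

def parse_json_lines(raw: str) -> list[dict]:
    """Parse streaming JSON Lines output, skipping malformed lines."""
    events = []
    for line in raw.splitlines():
        line = line.strip()
        if not line:
            continue
        try:
            events.append(json.loads(line))
        except json.JSONDecodeError:
            # Tolerate interleaved non-JSON output (e.g. progress text).
            continue
    return events
```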
| Tool | Description |
|---|---|
| `execute_opencode_command` | Execute any OpenCode CLI command with full flexibility |
| `opencode_run` | Run OpenCode with a simple prompt message |
| `opencode_continue_session` | Continue an existing OpenCode session |
| `opencode_list_models` | List available models, optionally filtered by provider |
| `opencode_export_session` | Export session data as JSON |
| `opencode_get_status` | Check OpenCode CLI availability and status |
- Python 3.10+
- OpenCode CLI installed and configured
- MCP-compatible client (Claude Desktop, etc.)
```bash
pip install -r requirements.txt
```

Add to your MCP client configuration (e.g., `~/.claude.json` or Claude Desktop settings):
```json
{
  "mcpServers": {
    "opencode": {
      "command": "python",
      "args": ["-m", "src.services.fast_mcp.opencode_server"],
      "cwd": "/path/to/opencode-mcp"
    }
  }
}
```

Once configured, the MCP tools are available through your MCP client:
```python
# Run a coding task
opencode_run(message="Create a Python function that calculates fibonacci numbers")

# List available models
opencode_list_models(provider="anthropic")

# Continue a previous session
opencode_continue_session(session_id="abc123", message="Now add unit tests")

# Check status
opencode_get_status()
```
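Under the hood, these tools shell out to the OpenCode CLI. A minimal sketch of non-blocking execution with timeout handling, using `asyncio` subprocesses (the `run_cli` helper is hypothetical; the actual executor in `opencode_executor.py` may differ):

```python
import asyncio

async def run_cli(command: str, args: list[str], timeout: int = 300) -> tuple[int, str, str]:
    """Run a CLI command without blocking the event loop, enforcing a timeout."""
    proc = await asyncio.create_subprocess_exec(
        command, *args,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    try:
        stdout, stderr = await asyncio.wait_for(proc.communicate(), timeout=timeout)
    except asyncio.TimeoutError:
        proc.kill()  # Reap the timed-out process so it does not linger.
        await proc.wait()
        raise
    return proc.returncode, stdout.decode(), stderr.decode()
```

A call such as `asyncio.run(run_cli("opencode", ["run", "hello"], timeout=60))` would then return the exit code plus captured stdout/stderr.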
Parameters for `execute_opencode_command`:

```python
{
    "prompt": str,             # Required: The prompt/task for OpenCode
    "model": str,              # Optional: Model in provider/model format (e.g., "anthropic/claude-sonnet-4-20250514")
    "agent": str,              # Optional: Agent to use (e.g., "build", "plan")
    "session": str,            # Optional: Session ID to continue
    "continue_session": bool,  # Optional: Whether to continue the last session
    "timeout": int             # Optional: Timeout in seconds (default: 300, max: 600)
}
```

Parameters for `opencode_run`:

```python
{
    "message": str,  # Required: Message/prompt to send
    "model": str,    # Optional: Model to use
    "agent": str,    # Optional: Agent to use
    "files": [str],  # Optional: Files to attach
    "timeout": int   # Optional: Timeout in seconds
}
```

Parameters for `opencode_continue_session`:

```python
{
    "session_id": str,  # Required: Session ID to continue
    "message": str,     # Optional: Follow-up message
    "timeout": int      # Optional: Timeout in seconds
}
```

Environment variables (prefix: `OPENCODE_`):
| Variable | Default | Description |
|---|---|---|
| `OPENCODE_COMMAND` | `opencode` | Path to OpenCode CLI |
| `OPENCODE_DEFAULT_MODEL` | None | Default model to use |
| `OPENCODE_DEFAULT_AGENT` | None | Default agent to use |
| `OPENCODE_DEFAULT_TIMEOUT` | `300` | Default timeout (seconds) |
| `OPENCODE_MAX_TIMEOUT` | `600` | Maximum timeout (seconds) |
| `OPENCODE_SERVER_LOG_LEVEL` | `INFO` | Logging level |
```
src/services/fast_mcp/opencode_server/
├── __init__.py
├── __main__.py              # Entry point
├── server.py                # MCP server & tool definitions
├── opencode_executor.py     # CLI execution wrapper
├── models.py                # Pydantic models
├── settings.py              # Configuration
└── handlers/
    ├── __init__.py
    ├── execution.py         # Run/continue operations
    ├── session.py           # Session management
    └── discovery.py         # Model/status discovery
```
Run the tests:

```bash
pytest tests/ -v
```

Format and lint:

```bash
black src/
ruff check src/
```

Planned features for v2.0:
- `opencode_import_session` - Import sessions from JSON/URL
- `opencode_list_sessions` - List all sessions with filtering
- `opencode_get_stats` - Usage statistics and cost tracking
- `opencode_list_agents` - List available agents
- `opencode_github_run` - GitHub Actions integration (async)
- `opencode_pr_checkout` - PR workflow support
Contributions are welcome! Please read the contributing guidelines before submitting PRs.
MIT License - see LICENSE for details.
- OpenCode - The AI coding agent this server integrates with
- Model Context Protocol - The protocol specification
- MCP SDK - Python SDK for MCP