A CLI host application that enables Large Language Models (LLMs) to interact with external tools through the Model Context Protocol (MCP). Currently supports both Claude 3.5 Sonnet and Ollama models.
MCPHost acts as a host in the MCP client-server architecture, where:
- Hosts (like MCPHost) are LLM applications that manage connections and interactions
- Clients maintain 1:1 connections with MCP servers
- Servers provide context, tools, and capabilities to the LLMs
This architecture allows language models to:
- Access external tools and data sources
- Maintain consistent context across interactions
- Execute commands and retrieve information safely
Currently supports:
- Claude 3.5 Sonnet (claude-3-5-sonnet-20240620)
- Any Ollama-compatible model with function calling support
- Interactive conversations with either Claude 3.5 Sonnet or Ollama models
- Support for multiple concurrent MCP servers
- Dynamic tool discovery and integration
- Tool calling capabilities for both model types
- Configurable MCP server locations and arguments
- Consistent command interface across model types
- Configurable message history window for context management
- Go 1.23.3 or later
- For Claude: An Anthropic API key
- For Ollama: Local Ollama installation with desired models
- One or more MCP-compatible tool servers
- Anthropic API Key (for Claude):
export ANTHROPIC_API_KEY='your-api-key'
- Ollama Setup:
- Install Ollama from https://ollama.ai
- Pull your desired model:
ollama pull mistral
- Ensure Ollama is running:
ollama serve
go install github.com/mark3labs/mcphost@latest
You can download pre-built binaries for your platform from the releases page.
- Download the appropriate archive for your platform:
- Linux:
mcphost_Linux_[arch].tar.gz
- macOS:
mcphost_Darwin_[arch].tar.gz
- Windows:
mcphost_Windows_[arch].zip
Where `[arch]` is either `x86_64` (for Intel/AMD) or `arm64` (for ARM-based processors).
- Extract the archive:

```shell
# For Linux/macOS
tar xzf mcphost_[OS]_[arch].tar.gz

# For Windows, extract the zip file using your preferred tool
```
- Move the binary to a directory in your PATH:

```shell
# For Linux/macOS
sudo mv mcphost /usr/local/bin/

# For Windows, move mcphost.exe to a directory in your PATH
```
Each release includes a `checksums.txt` file containing SHA256 checksums for all release artifacts. To verify your download:

- Download both the binary and `checksums.txt`
- Run:

```shell
# On Linux/macOS
sha256sum --check checksums.txt

# On Windows (PowerShell)
Get-FileHash mcphost_Windows_[arch].zip | Format-List
# Compare the hash with the one in checksums.txt
```
MCPHost will automatically create a configuration file at `~/.mcp.json` if it doesn't exist. You can also specify a custom location using the `--config` flag:
```json
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": [
        "mcp-server-sqlite",
        "--db-path",
        "/tmp/foo.db"
      ]
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/tmp"
      ]
    }
  }
}
```
Each MCP server entry requires:

- `command`: The command to run (e.g., `uvx`, `npx`)
- `args`: Array of arguments for the command:
  - For SQLite server: `mcp-server-sqlite` with the database path
  - For filesystem server: `@modelcontextprotocol/server-filesystem` with the directory path
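As a quick sanity check before launching MCPHost, the config can be parsed and each server's launch command listed. This sketch writes the example config to a scratch path (a placeholder, not MCPHost's default location):

```shell
# Write the example config to a scratch location, confirm it parses as
# JSON, and print the command each server entry would launch.
cat > /tmp/mcp-demo.json <<'EOF'
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "/tmp/foo.db"]
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
EOF

python3 - <<'PY'
import json

with open("/tmp/mcp-demo.json") as f:
    cfg = json.load(f)  # raises ValueError if the file is malformed

for name, srv in cfg["mcpServers"].items():
    print(f"{name}: {srv['command']} {' '.join(srv['args'])}")
PY
```

If the file is malformed, `json.load` raises an error instead of printing the server list, which catches typos before MCPHost tries to start the servers.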
- Using Claude 3.5 Sonnet:
mcphost
- Using Ollama:
mcphost ollama --model mistral
Note: Tool support in Ollama requires models that support function calling.
- Custom config file:
mcphost --config /path/to/config.json
- Set message history window:
mcphost --message-window 15
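Conceptually, the message window acts like a sliding buffer over the conversation: only the most recent N messages are sent to the model as context. A rough illustration of the idea (not MCPHost's actual implementation):

```shell
python3 - <<'PY'
# Illustrative only: keep the last N messages as the model's context.
WINDOW = 3  # analogous to --message-window 3
history = ["msg1", "msg2", "msg3", "msg4", "msg5"]
context = history[-WINDOW:]
print(context)  # -> ['msg3', 'msg4', 'msg5']
PY
```

A smaller window reduces token usage per request at the cost of the model forgetting earlier turns sooner.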
While chatting, you can use:
- `/help`: Show available commands
- `/tools`: List all available tools
- `/servers`: List configured MCP servers
- `/history`: Display conversation history
- `/quit`: Exit the application
- `Ctrl+C`: Exit at any time
- `--config`: Specify custom config file location
- `--message-window`: Set number of messages to keep in context (default: 10)
MCPHost can work with any MCP-compliant server. For examples and reference implementations, see the MCP Servers Repository.
Contributions are welcome! Feel free to:
- Submit bug reports or feature requests through issues
- Create pull requests for improvements
- Share your custom MCP servers
- Improve documentation
Please ensure your contributions follow good coding practices and include appropriate tests.
This project is licensed under the MIT License - see the LICENSE file for details.
- Thanks to the Anthropic team for Claude and the MCP specification
- Thanks to the Ollama team for their local LLM runtime
- Thanks to all contributors who have helped improve this tool