A powerful, feature-rich command-line interface for interacting with Model Context Protocol (MCP) servers. This client enables seamless communication with LLMs through the CHUK-MCP protocol library, a Pyodide-compatible, pure-Python implementation of the MCP protocol, and supports tool usage, conversation management, and multiple operational modes.
The core protocol implementation has been moved to a separate package at: https://github.com/chrishayuk/chuk-mcp
This CLI is built on top of the protocol library, focusing on providing a rich user experience while the protocol library handles the communication layer.
- **Multiple Operational Modes**:
  - **Chat Mode**: Conversational interface with direct LLM interaction and automated tool usage
  - **Interactive Mode**: Command-driven interface for direct server operations
  - **Command Mode**: Unix-friendly mode for scriptable automation and pipelines
  - **Direct Commands**: Run individual commands without entering interactive mode
- **Multi-Provider Support**:
  - OpenAI integration (`gpt-4o-mini`, `gpt-4o`, `gpt-4-turbo`, etc.)
  - Ollama integration (`llama3.2`, `qwen2.5-coder`, etc.)
  - Extensible architecture for additional providers
- **Robust Tool System**:
  - Automatic discovery of server-provided tools
  - Server-aware tool execution
  - Tool call history tracking and analysis
  - Support for complex, multi-step tool chains
- **Advanced Conversation Management**:
  - Complete conversation history tracking
  - Filtering and viewing specific message ranges
  - JSON export capabilities for debugging or analysis
  - Conversation compaction for reduced token usage
- **Rich User Experience**:
  - Command completion with context-aware suggestions
  - Colorful, formatted console output
  - Progress indicators for long-running operations
  - Detailed help and documentation
- **Resilient Resource Management**:
  - Proper cleanup of asyncio resources
  - Graceful error handling
  - Clean terminal restoration
  - Support for multiple simultaneous server connections
- Python 3.11 or higher
- For OpenAI: valid API key in the `OPENAI_API_KEY` environment variable
- For Ollama: local Ollama installation
- Server configuration file (default: `server_config.json`)
- CHUK-MCP protocol library
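For OpenAI, a minimal setup (assuming a Unix-like shell) is to export the key before launching the CLI:

```bash
# Make the OpenAI API key visible to mcp-cli in this shell session
# (replace the placeholder with your own key)
export OPENAI_API_KEY="your-api-key-here"
```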
1. Clone the repository:

   ```bash
   git clone https://github.com/chrishayuk/mcp-cli
   cd mcp-cli
   ```

2. Install the package with development dependencies:

   ```bash
   pip install -e ".[cli,dev]"
   ```

3. Run the CLI:

   ```bash
   mcp-cli --help
   ```
If you prefer using UV for dependency management:
```bash
# Install UV if not already installed
pip install uv

# Install dependencies
uv sync --reinstall

# Run using UV
uv run mcp-cli --help
```
Global options available for all modes and commands:
- `--server`: Specify the server(s) to connect to (comma-separated for multiple)
- `--config-file`: Path to the server configuration file (default: `server_config.json`)
- `--provider`: LLM provider to use (`openai` or `ollama`; default: `openai`)
- `--model`: Specific model to use (provider-dependent defaults)
- `--disable-filesystem`: Disable filesystem access (default: true)
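These options can be combined on any command; for example (using the `sqlite` and `another-server` entries from the sample configuration shown later in this document):

```bash
# Chat against two configured servers with an explicit provider and model
mcp-cli chat --server sqlite,another-server --provider ollama --model llama3.2
```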
You might encounter a "Missing argument 'KWARGS'" error when running various commands. This is due to how the CLI parser is configured. To resolve this, use one of these approaches:
1. Use the equals-sign format for all arguments:

   ```bash
   mcp-cli tools call --server=sqlite
   mcp-cli chat --server=sqlite --provider=ollama --model=llama3.2
   ```

2. Add a double dash (`--`) after the command and before the arguments:

   ```bash
   mcp-cli tools call -- --server sqlite
   mcp-cli chat -- --server sqlite --provider ollama --model llama3.2
   ```
These format issues apply to all commands (chat, interactive, tools, etc.) and stem from how the argument parser distinguishes positional from named arguments.
Chat mode provides a natural language interface for interacting with LLMs, where the model can automatically use available tools:
```bash
# Default (chat mode runs when no other command is specified)
uv run mcp-cli

# Explicit chat mode
uv run mcp-cli chat --server sqlite
```
Interactive mode provides a command-driven shell interface for direct server operations:
```bash
uv run mcp-cli interactive --server sqlite
```
Command mode provides a Unix-friendly interface for automation and pipeline integration:
```bash
uv run mcp-cli cmd --server sqlite [options]
```
Run individual commands without entering interactive mode:

```bash
# List available tools
uv run mcp-cli tools list {} --server sqlite

# Call a specific tool
uv run mcp-cli tools call {} --server sqlite
```
Chat mode provides a conversational interface with the LLM, automatically using available tools when needed.
```bash
# Default (chat mode)
uv run mcp-cli --server sqlite

# Explicit chat mode
uv run mcp-cli chat --server sqlite

# With a specific provider and model
uv run mcp-cli chat --server sqlite --provider openai --model gpt-4o
```

```bash
# Note: be careful with the command syntax.
# Correct format without any KWARGS parameter:
uv run mcp-cli chat --server sqlite --provider ollama --model llama3.2

# Or, if you encounter the "Missing argument 'KWARGS'" error, try:
uv run mcp-cli chat --server=sqlite --provider=ollama --model=llama3.2
```
In chat mode, use these slash commands:
- `/help`: Show available commands
- `/help <command>`: Show detailed help for a specific command
- `/quickhelp` or `/qh`: Display a quick reference of common commands
- `exit` or `quit`: Exit chat mode

- `/tools`: Display all available tools with their server information
- `/tools --all`: Show detailed tool information including parameters
- `/tools --raw`: Show raw tool definitions
- `/toolhistory` or `/th`: Show history of tool calls in the current session
- `/th <N>`: Show details for a specific tool call
- `/th -n 5`: Show only the last 5 tool calls
- `/th --json`: Show tool calls in JSON format

- `/conversation` or `/ch`: Show the conversation history
- `/ch <N>`: Show a specific message from history
- `/ch -n 5`: Show only the last 5 messages
- `/ch <N> --json`: Show a specific message in JSON format
- `/ch --json`: View the entire conversation history in raw JSON format
- `/save <filename>`: Save conversation history to a JSON file
- `/compact`: Condense conversation history into a summary

- `/cls`: Clear the screen while keeping conversation history
- `/clear`: Clear both the screen and conversation history
- `/verbose` or `/v`: Toggle between verbose and compact tool display modes

- `/interrupt`, `/stop`, or `/cancel`: Interrupt running tool execution
- `/provider <name>`: Change the current LLM provider
- `/model <name>`: Change the current LLM model
- `/servers`: List connected servers and their status
Interactive mode provides a command-driven shell interface for direct server interaction.
```bash
# Using {} to satisfy the KWARGS requirement
mcp-cli interactive {} --server sqlite
```
In interactive mode, use these commands:
- `help`: Show available commands
- `exit`, `quit`, or `q`: Exit interactive mode
- `clear` or `cls`: Clear the terminal screen
- `servers` or `srv`: List connected servers with their status
- `tools` or `t`: List available tools or call one interactively
- `tools --all`: Show detailed tool information
- `tools --raw`: Show raw JSON definitions
- `tools call`: Launch the interactive tool-call UI
- `resources` or `res`: List available resources from all servers
- `prompts` or `p`: List available prompts from all servers
- `ping`: Ping connected servers (optionally filter by index/name)
Command mode provides a Unix-friendly interface for automation and pipeline integration.
```bash
# Using {} to satisfy the KWARGS requirement
mcp-cli cmd {} --server sqlite [options]
```
- `--input`: Input file path (use `-` for stdin)
- `--output`: Output file path (use `-` for stdout, the default)
- `--prompt`: Prompt template (use `{{input}}` as a placeholder for the input)
- `--raw`: Output raw text without formatting
- `--tool`: Directly call a specific tool
- `--tool-args`: JSON arguments for the tool call
- `--system-prompt`: Custom system prompt
- `--verbose`: Enable verbose logging
Process content with LLM:
```bash
# Summarize a document
uv run mcp-cli cmd --server sqlite --input document.md --prompt "Summarize this: {{input}}" --output summary.md

# Process stdin and output to stdout
cat document.md | mcp-cli cmd {} --server sqlite --input - --prompt "Extract key points: {{input}}"
```
Call tools directly:
```bash
# List database tables
uv run mcp-cli cmd {} --server sqlite --tool list_tables --raw

# Run a SQL query
uv run mcp-cli cmd {} --server sqlite --tool read_query --tool-args '{"query": "SELECT COUNT(*) FROM users"}'
```
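Because cmd mode writes to stdout, its output composes with standard Unix tools. A minimal sketch, assuming the tool emits its result as JSON:

```bash
# Pretty-print the raw tool output with jq (assumes the tool emits JSON)
uv run mcp-cli cmd {} --server sqlite --tool read_query \
  --tool-args '{"query": "SELECT COUNT(*) FROM users"}' --raw | jq .
```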
Batch processing:
```bash
# Process multiple files with GNU Parallel
ls *.md | parallel mcp-cli cmd --server sqlite --input {} --output {}.summary.md --prompt "Summarize: {{input}}"
```
Run individual commands without entering interactive mode:
```bash
# List all tools (using {} to satisfy the KWARGS requirement)
uv run mcp-cli tools list {} --server sqlite

# Show detailed tool information
uv run mcp-cli tools list {} --server sqlite --all

# Show raw tool definitions
uv run mcp-cli tools list {} --server sqlite --raw

# Call a specific tool interactively
uv run mcp-cli tools call {} --server sqlite

# List available resources
uv run mcp-cli resources list {} --server sqlite

# List available prompts
uv run mcp-cli prompts list {} --server sqlite

# Ping all servers
uv run mcp-cli ping {} --server sqlite

# Ping specific server(s)
uv run mcp-cli ping {} --server sqlite,another-server
```
Create a `server_config.json` file with your server configurations:

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "python",
      "args": ["-m", "mcp_server.sqlite_server"],
      "env": {
        "DATABASE_PATH": "your_database.db"
      }
    },
    "another-server": {
      "command": "python",
      "args": ["-m", "another_server_module"],
      "env": {}
    }
  }
}
```
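Each key under `mcpServers` is a server name you can pass to `--server`; `command`, `args`, and `env` describe how to launch that server. To keep the file somewhere other than the default location, point the CLI at it with the `--config-file` option:

```bash
# Use a configuration file at a non-default path (path shown is illustrative)
uv run mcp-cli chat --server sqlite --config-file ./configs/server_config.json
```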
In chat mode, simply ask questions that require tool usage, and the LLM will automatically call the appropriate tools:
```
You: What tables are available in the database?
[Tool Call: list_tables]
Assistant: There's one table in the database named products. How would you like to proceed?

You: Select top 10 products ordered by price in descending order
[Tool Call: read_query]
Assistant: Here are the top 10 products ordered by price in descending order:
1. Mini Drone - $299.99
2. Smart Watch - $199.99
3. Portable SSD - $179.99
...
```
You can change providers or models during a chat session:
```
> /provider
Current provider: openai
To change provider: /provider <provider_name>

> /model
Current model: gpt-4o-mini
To change model: /model <model_name>

> /model gpt-4o
Switched to model: gpt-4o
```
The MCP CLI provides powerful conversation history management:
```
> /conversation
Conversation History (12 messages)
# | Role      | Content
1 | system    | You are an intelligent assistant capable of using t...
2 | user      | What tables are available in the database?
3 | assistant | Let me check for you.
...

> /save conversation.json
Conversation saved to conversation.json

> /compact
Conversation history compacted with summary.
```
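The exported file is plain JSON, so it can be inspected with ordinary tooling. A minimal sketch, assuming the export is an array of message objects with `role` and `content` fields (matching the history table above):

```bash
# List the role of every message in the saved conversation
jq '.[].role' conversation.json
```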
The CLI is organized with optional dependency groups:
- `cli`: Rich terminal UI, command completion, and provider integrations
- `dev`: Development tools and testing utilities
- `wasm`: Reserved for future WebAssembly support
- `chuk-mcp`: Protocol implementation library (core dependency)
Install with specific extras using:
```bash
pip install "mcp-cli[cli]"      # Basic CLI features
pip install "mcp-cli[cli,dev]"  # CLI with development tools
```
Contributions are welcome! Please follow these steps:
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Rich for beautiful terminal formatting
- Typer for CLI argument parsing
- Prompt Toolkit for interactive input
- CHUK-MCP for the core protocol implementation