A versatile Chat Client that supports the Model Context Protocol (MCP), works via CLI and Web UI, and integrates with both local and cloud LLMs.
- Dual Interface: CLI and Web UI (React + Tailwind).
- Local LLM: Built-in support for `llama-cpp-python` (auto-downloads TinyLlama).
- Cloud LLM: Support for OpenAI, Gemini, and Claude.
- MCP Support: Connect to multiple MCP servers via `mcp.json`.
- Memory: Persistent conversation history (SQLite).
- Rich UI: Streaming responses, Markdown, code highlighting, Mermaid diagrams.
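The Memory feature keeps conversation history in SQLite, so chats survive restarts. As an illustration of how such persistence can work (this is a sketch, not the client's actual schema; the table and column names are assumptions), a minimal history store looks like:

```python
import sqlite3

# Illustrative only: the real client's schema may differ.
def open_history(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) the history database."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS messages (
               id INTEGER PRIMARY KEY AUTOINCREMENT,
               role TEXT NOT NULL,      -- 'user' or 'assistant'
               content TEXT NOT NULL
           )"""
    )
    return conn

def add_message(conn: sqlite3.Connection, role: str, content: str) -> None:
    """Append one message to the persistent log."""
    conn.execute(
        "INSERT INTO messages (role, content) VALUES (?, ?)", (role, content)
    )
    conn.commit()

def load_messages(conn: sqlite3.Connection) -> list[tuple[str, str]]:
    """Return the full conversation in insertion order."""
    return list(conn.execute("SELECT role, content FROM messages ORDER BY id"))
```

Pointing `open_history` at a file path instead of `:memory:` is what makes the history persistent across runs.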
- Install Dependencies:

  ```bash
  pip install -r requirements.txt
  cd ui && npm install
  ```
- Configuration:
  - Create a `.env` file (optional) for API keys:

    ```bash
    OPENAI_API_KEY=...
    GEMINI_API_KEY=...
    ANTHROPIC_API_KEY=...
    ```

  - Edit `mcp.json` to add MCP servers.
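For reference, many MCP clients follow the `mcpServers` convention popularized by Claude Desktop, so an entry along these lines is plausible (the server name, command, and path here are placeholders, and this project's exact schema may differ; check the shipped `mcp.json` for the authoritative format):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/dir"]
    }
  }
}
```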
Run the chat interface in the terminal:

```bash
python -m cli.main chat
```

- Start the backend server:

  ```bash
  python -m cli.main serve
  ```

- Start the frontend (in a separate terminal):

  ```bash
  cd ui
  npm run dev
  ```

- Open http://localhost:5173
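With the backend running, other tools can talk to it over HTTP. The endpoint path and request fields below are assumptions (verify them against the backend's actual route definitions), but a minimal client sketch might look like:

```python
import json
import urllib.request

# NOTE: "/api/chat" and the payload field names are assumptions --
# check the backend's routes before relying on them.
API_URL = "http://localhost:8000/api/chat"

def build_payload(message: str, provider: str = "local") -> bytes:
    """Serialize a chat request body as JSON bytes."""
    return json.dumps({"message": message, "provider": provider}).encode("utf-8")

def send_chat(message: str) -> str:
    """POST a message to the running backend and return the raw response."""
    req = urllib.request.Request(
        API_URL,
        data=build_payload(message),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```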
To try it out quickly without persisting data:

```bash
docker run -p 8000:8000 alexmerced/robust_mcp_client
```

Note: models will be re-downloaded and chat history lost when the container stops.

To persist models and history:

```bash
docker run -p 8000:8000 \
  -v $(pwd)/models:/app/models \
  -v $(pwd)/history.db:/app/history.db \
  alexmerced/robust_mcp_client
```

You can also build the image yourself:
- Build the image:

  ```bash
  docker build -t robust-mcp-client .
  ```

- Run the container:

  ```bash
  docker run -p 8000:8000 \
    -v $(pwd)/models:/app/models \
    -v $(pwd)/history.db:/app/history.db \
    robust-mcp-client
  ```

- `-p 8000:8000`: Expose the server port.
- `-v .../models`: Persist downloaded models.
- `-v .../history.db`: Persist conversation history.
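If you prefer not to retype the port and volume flags, the same setup can be captured in a Compose file (illustrative sketch; the service name is arbitrary and assumes the image was built as `robust-mcp-client` per the step above):

```yaml
services:
  mcp-client:
    image: robust-mcp-client
    ports:
      - "8000:8000"
    volumes:
      - ./models:/app/models
      - ./history.db:/app/history.db
```

Then `docker compose up` starts the container with the same mounts as the `docker run` command.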
- Access the UI: Open http://localhost:8000
- Run tests:

  ```bash
  python -m pytest tests/
  ```