*Model Context Protocol (MCP) architecture: the flow between the user, application code, MCP client/server, and external services.*
This project is a fork of the Introduction to Model Context Protocol project by Anthropic. The original project was designed specifically for Anthropic's Claude API. This fork has been adapted to work with any LLM that provides an OpenAI-compatible API, enabling you to use self-hosted language models, OpenAI, Anthropic Claude, or other providers instead of being limited to a single service.
Universal MCP Chat is a command-line application for interactive chat with AI models. It supports document retrieval, command-based prompts, and extensible tool integrations via the Model Context Protocol (MCP) architecture.
- Python 3.9+
- Access to an LLM with an OpenAI-compatible API (e.g., Ollama, vLLM, or text-generation-webui for self-hosted setups)
- Copy the example environment file:

  ```bash
  cp .env.example .env
  ```

- Edit the `.env` file and set the following variables for your self-hosted LLM:

  ```
  LLM_MODEL=your-model-name              # Model name/identifier
  LLM_API_KEY=your-api-key-here          # API key (may not be required by some setups)
  LLM_BASE_URL=http://localhost:11434/v1 # Your LLM's OpenAI-compatible endpoint
  ```
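In code, these settings are typically read from the environment (the project installs python-dotenv for exactly this). A minimal stdlib-only sketch of the lookup — `load_llm_config` is a hypothetical helper name, and the fallback values are illustrative:

```python
import os

# Minimal sketch: read the three LLM_* settings, with illustrative fallbacks.
# (The real project loads .env via python-dotenv; this uses only the stdlib.)
def load_llm_config(env=os.environ):
    return {
        "model": env.get("LLM_MODEL", "llama2"),
        "api_key": env.get("LLM_API_KEY", "not-needed"),  # some local servers ignore the key
        "base_url": env.get("LLM_BASE_URL", "http://localhost:11434/v1"),
    }

print(load_llm_config({"LLM_MODEL": "mistral"}))
```

The resulting dict maps directly onto the `model`, `api_key`, and `base_url` arguments an OpenAI-compatible client expects.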
Ollama (local):

```
LLM_MODEL=llama2
LLM_API_KEY=any-value-or-empty
LLM_BASE_URL=http://localhost:11434/v1
```

vLLM Server:

```
LLM_MODEL=your-model-name
LLM_API_KEY=your-api-key
LLM_BASE_URL=http://your-server:8000/v1
```

text-generation-webui with OpenAI extension:

```
LLM_MODEL=your-model-name
LLM_API_KEY=any-value
LLM_BASE_URL=http://your-server:5000/v1
```
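To sanity-check that an endpoint really speaks the OpenAI API, you can list its models: OpenAI-compatible servers expose `GET {base_url}/models`. A hedged stdlib-only sketch (`models_url` and `list_models` are hypothetical helpers, not part of this project):

```python
import json
import urllib.request

def models_url(base_url: str) -> str:
    # OpenAI-compatible servers expose GET {base_url}/models
    return base_url.rstrip("/") + "/models"

def list_models(base_url: str, api_key: str = "none"):
    # Some local servers ignore the Authorization header entirely.
    req = urllib.request.Request(
        models_url(base_url),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return [m["id"] for m in json.load(resp)["data"]]

print(models_url("http://localhost:11434/v1"))
```

If `list_models("http://localhost:11434/v1")` returns your model's name, the `LLM_BASE_URL` value is correct.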
uv is a fast Python package installer and resolver.
- Install uv, if not already installed:

  ```bash
  pip install uv
  ```

- Create and activate a virtual environment:

  ```bash
  uv venv
  source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  ```

- Install dependencies:

  ```bash
  uv pip install -e .
  ```

- Run the project:

  ```bash
  uv run main.py
  ```

Alternatively, set up the project manually without uv:

- Create and activate a virtual environment:

  ```bash
  python -m venv .venv
  source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  ```

- Install dependencies:

  ```bash
  pip install openai python-dotenv prompt-toolkit "mcp[cli]==1.8.0"
  ```

- Run the project:

  ```bash
  python main.py
  ```

- Tool Calling: The application supports tool calling if your self-hosted LLM supports the OpenAI tools/function-calling format
- Model Compatibility: Ensure your LLM supports the features you need (tool calling, system prompts, etc.)
- API Compatibility: Your LLM endpoint must be OpenAI-compatible (most modern self-hosted solutions support this)
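Concretely, "the OpenAI tools/function-calling format" means each tool is described to the model as a JSON schema. A sketch using a hypothetical `read_document` tool (illustrative — not one of this project's actual tools):

```python
# Hedged sketch of the OpenAI tools/function-calling format.
# The tool name and parameters are hypothetical, for illustration only.
read_document_tool = {
    "type": "function",
    "function": {
        "name": "read_document",
        "description": "Return the contents of a document by ID.",
        "parameters": {
            "type": "object",
            "properties": {
                "doc_id": {
                    "type": "string",
                    "description": "Document identifier, e.g. 'deposition.md'",
                },
            },
            "required": ["doc_id"],
        },
    },
}

print(read_document_tool["function"]["name"])
```

A dict like this would be passed as `tools=[read_document_tool]` to the chat completions call; an LLM that supports the format may then respond with a structured tool call instead of plain text.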
Simply type your message and press Enter to chat with the model.
Use the `@` symbol followed by a document ID to include document content in your query:

```
> Tell me about @deposition.md
```
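One plausible way such @-mentions get expanded before the prompt reaches the model — a hypothetical sketch, with `DOCS` standing in for whatever document store the MCP server exposes:

```python
import re

# Hypothetical sketch of @-mention expansion: replace "@<doc_id>" with the
# document's contents before sending the query to the model.
DOCS = {"deposition.md": "Deposition text..."}  # stand-in for the MCP resource store

def expand_mentions(query: str, docs=DOCS) -> str:
    def repl(match):
        doc_id = match.group(1)
        # Unknown IDs are left untouched rather than dropped.
        return docs.get(doc_id, match.group(0))
    return re.sub(r"@(\S+)", repl, query)

print(expand_mentions("Tell me about @deposition.md"))
```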
Use the `/` prefix to execute commands defined in the MCP server:

```
> /summarize deposition.md
```
Commands will auto-complete when you press Tab.
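A plausible shape for the slash-command dispatch — a hypothetical sketch in which `COMMANDS` stands in for the prompts registered on the MCP server:

```python
# Hypothetical sketch of slash-command routing; the real project resolves
# these against prompt definitions provided by the MCP server.
COMMANDS = {
    "summarize": lambda arg: f"[prompt: summarize] {arg}",  # stand-in handler
}

def handle_line(line: str):
    """Route '/name args' to a command handler; return None for plain chat."""
    if not line.startswith("/"):
        return None  # ordinary chat message, sent straight to the model
    name, _, args = line[1:].partition(" ")
    handler = COMMANDS.get(name)
    if handler is None:
        return f"Unknown command: /{name}"
    return handler(args)

print(handle_line("/summarize deposition.md"))
```

The same `COMMANDS` mapping would also feed the Tab-completion list (the project installs prompt-toolkit, which provides completers for exactly this).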
