crestaa/Universal-MCP-Chat

Universal MCP Chat

MCP Architecture

Model Context Protocol (MCP) architecture showing the flow between user, application code, MCP client/server, and external services

Overview

This project is a fork of the Introduction to Model Context Protocol project by Anthropic. The original project was designed specifically for Anthropic's Claude API. This fork has been adapted to work with any LLM that provides an OpenAI-compatible API, enabling you to use self-hosted language models, OpenAI, Anthropic Claude, or other providers instead of being limited to a single service.

Universal MCP Chat is a command-line chat application for interacting with AI models. It supports document retrieval, command-based prompts, and extensible tool integrations via the Model Context Protocol (MCP) architecture.

Prerequisites

  • Python 3.9+
  • Access to a self-hosted LLM with an OpenAI-compatible API (such as Ollama, vLLM, or text-generation-webui)

Setup

Step 1: Configure the environment variables

  1. Copy the example environment file:
cp .env.example .env
  2. Edit the .env file and set the following variables for your self-hosted LLM:
LLM_MODEL=your-model-name                    # Model name/identifier
LLM_API_KEY=your-api-key-here               # API key (may not be required for some setups)
LLM_BASE_URL=http://localhost:11434/v1      # Your LLM's OpenAI-compatible endpoint

Common Self-Hosted LLM Configurations

Ollama (local):

LLM_MODEL=llama2
LLM_API_KEY=any-value-or-empty
LLM_BASE_URL=http://localhost:11434/v1

vLLM Server:

LLM_MODEL=your-model-name
LLM_API_KEY=your-api-key
LLM_BASE_URL=http://your-server:8000/v1

text-generation-webui with OpenAI extension:

LLM_MODEL=your-model-name
LLM_API_KEY=any-value
LLM_BASE_URL=http://your-server:5000/v1

Step 2: Install dependencies

Option 1: Setup with uv (Recommended)

uv is a fast Python package installer and resolver.

  1. Install uv, if not already installed:
pip install uv
  2. Create and activate a virtual environment:
uv venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  3. Install dependencies:
uv pip install -e .
  4. Run the project:
uv run main.py

Option 2: Setup without uv

  1. Create and activate a virtual environment:
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  2. Install dependencies:
pip install openai python-dotenv prompt-toolkit "mcp[cli]==1.8.0"
  3. Run the project:
python main.py

Important Notes

  • Tool Calling: Tool calling is available only if your self-hosted LLM implements the OpenAI tools/function-calling format
  • Model Compatibility: Ensure your LLM supports the features you need (tool calling, system prompts, etc.)
  • API Compatibility: Your LLM endpoint must be OpenAI-compatible (most modern self-hosted solutions support this)
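For reference, supporting the OpenAI tools format means the backend accepts tool definitions as JSON-schema function specs in chat.completions requests. A sketch of the shape such a definition takes (the read_document tool and its doc_id parameter are illustrative, not taken from this repo):

```python
# Illustrative only: the shape of an OpenAI-format tool definition that a
# compatible backend must accept alongside the chat messages.
tool_spec = {
    "type": "function",
    "function": {
        "name": "read_document",
        "description": "Return the contents of a document by its ID.",
        "parameters": {
            "type": "object",
            "properties": {
                "doc_id": {
                    "type": "string",
                    "description": "Document identifier, e.g. deposition.md",
                }
            },
            "required": ["doc_id"],
        },
    },
}
```

A backend that parses this schema, emits a tool_calls message, and accepts the tool result back as a "tool" role message is compatible with the application's tool loop.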

Usage

Basic Interaction

Simply type your message and press Enter to chat with the model.

Document Retrieval

Use the @ symbol followed by a document ID to include document content in your query:

> Tell me about @deposition.md
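Conceptually, the @ lookup is a pre-processing pass that pulls document IDs out of the prompt before the query is sent to the model. A hypothetical sketch of such a parser (not the project's actual implementation):

```python
import re


def extract_mentions(prompt: str) -> list[str]:
    """Return document IDs referenced with '@' in a chat prompt."""
    # Match '@' followed by a filename-like token (letters, digits, '.', '_', '-').
    return re.findall(r"@([\w.-]+)", prompt)


print(extract_mentions("Tell me about @deposition.md"))  # ['deposition.md']
```

Each extracted ID can then be resolved to document content and injected into the model's context along with the user's message.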

Commands

Use the / prefix to execute commands defined in the MCP server:

> /summarize deposition.md

Commands will auto-complete when you press Tab.
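Tab completion of this kind is typically built on prompt-toolkit, which is already one of the project's dependencies. A minimal sketch with made-up command names (in the app, the list would come from the prompts exposed by the MCP server):

```python
from prompt_toolkit.completion import WordCompleter
from prompt_toolkit.document import Document

# Hypothetical command names, for illustration only.
commands = ["/summarize", "/translate"]

# sentence=True matches candidates against the whole input line, so typing
# "/su" and pressing Tab offers "/summarize".
completer = WordCompleter(commands, sentence=True)

matches = [c.text for c in completer.get_completions(Document("/su"), None)]
print(matches)
```

In the REPL, the completer would be passed to a PromptSession(completer=...) so candidates appear when Tab is pressed.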

