This example shows how to implement a Model Context Protocol (MCP) server in Python and integrate it with different LLM backends:

- OpenAI via SDK v1.x (`OpenAI(api_key=...).chat.completions.create()`)
- Local LLaMA via `llama-cpp-python`
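
Either backend can sit behind a common chat interface so the rest of the code doesn't care which one is in use. A minimal sketch, where the class names, the default model name, and `run_turn` are illustrative rather than taken from this repo:

```python
from typing import Protocol


class ChatBackend(Protocol):
    """Anything that can answer a list of chat messages with a string."""
    def chat(self, messages: list[dict]) -> str: ...


class OpenAIBackend:
    """Wraps the OpenAI v1.x SDK (requires `pip install openai`)."""
    def __init__(self, api_key: str, model: str = "gpt-4o-mini"):  # model is an assumption
        from openai import OpenAI
        self.client = OpenAI(api_key=api_key)
        self.model = model

    def chat(self, messages: list[dict]) -> str:
        resp = self.client.chat.completions.create(model=self.model, messages=messages)
        return resp.choices[0].message.content


class LlamaCppBackend:
    """Wraps llama-cpp-python's chat API (requires a local GGUF model file)."""
    def __init__(self, model_path: str):
        from llama_cpp import Llama
        self.llm = Llama(model_path=model_path)

    def chat(self, messages: list[dict]) -> str:
        resp = self.llm.create_chat_completion(messages=messages)
        return resp["choices"][0]["message"]["content"]


def run_turn(backend: ChatBackend, user_text: str) -> str:
    """Send a single user message to whichever backend was configured."""
    return backend.chat([{"role": "user", "content": user_text}])
```

The heavy imports live inside the constructors, so installing only one of the two backend libraries is enough to run the other path.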
- Create and activate a virtual environment:

      python3 -m venv venv
      source venv/bin/activate   # Linux/macOS
      venv\Scripts\activate      # Windows
- Install the dependencies:

      pip install -r requirements.txt
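
The contents of `requirements.txt` are not shown here; a plausible minimal set for this stack (the exact entries and any version pins are assumptions) would be:

```text
uvicorn
openai>=1.0
llama-cpp-python
```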
- Run the server:

      uvicorn server:app --reload
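
`server:app` implies `server.py` exposes an ASGI application. A dependency-free sketch of what it might contain, here serving a hypothetical `echo` tool over plain JSON (a real MCP server would follow the protocol's message framing rather than this ad-hoc format):

```python
import json

# Hypothetical tool registry: tools the connected LLM is allowed to call.
TOOLS = {
    "echo": lambda args: {"result": args.get("text", "")},
}


async def app(scope, receive, send):
    """Raw ASGI callable, which is what `uvicorn server:app` loads."""
    assert scope["type"] == "http"
    # Collect the full request body (it may arrive in several chunks).
    body = b""
    while True:
        message = await receive()
        body += message.get("body", b"")
        if not message.get("more_body"):
            break
    request = json.loads(body or b"{}")
    tool = TOOLS.get(request.get("tool"))
    payload = tool(request.get("arguments", {})) if tool else {"error": "unknown tool"}
    data = json.dumps(payload).encode()
    await send({"type": "http.response.start", "status": 200,
                "headers": [(b"content-type", b"application/json")]})
    await send({"type": "http.response.body", "body": data})
```

Because an ASGI app is just an async callable, this runs under uvicorn as-is; swapping in FastAPI or the official MCP SDK changes only how `app` is built.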
- Run the client:

      python client.py
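
`client.py` is not shown either; a minimal sketch, assuming the server accepts JSON tool invocations at uvicorn's default bind address (the URL and the `encode_request`/`call_tool` names are illustrative):

```python
import json
import urllib.request

SERVER_URL = "http://127.0.0.1:8000"  # uvicorn's default bind address


def encode_request(tool: str, arguments: dict) -> bytes:
    """Serialize a tool invocation for the server."""
    return json.dumps({"tool": tool, "arguments": arguments}).encode()


def call_tool(tool: str, arguments: dict, url: str = SERVER_URL) -> dict:
    """POST a tool invocation and return the decoded JSON reply."""
    req = urllib.request.Request(
        url,
        data=encode_request(tool, arguments),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

`python client.py` would then call something like `call_tool("echo", {"text": "hello"})` under an `if __name__ == "__main__":` guard, with the server from the previous step already running.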