This project demonstrates a comprehensive implementation of the Model Context Protocol (MCP), which allows AI models to communicate with external tools through a standardized interface.
MCP is an open protocol that standardizes how applications provide context to large language models (LLMs). It enables communication between AI systems and locally running MCP servers that provide additional tools and resources to extend AI capabilities.
- `server_SGL.py` - Key-Value store server (Set, Get, List operations)
- `server_CALC.py` - Calculator server (add, subtract, multiply, divide, sqrt)
- `server_WEATHER.py` - Weather information server (current weather, forecast, cities)
- `multi_server_client.py` - Main client that connects to multiple MCP servers and uses the Groq API
- `start_servers.bat` - Batch file to start all servers on different ports
- `.env` - Environment file for API keys (not included in the repository)
- `requirements.txt` - Dependencies for Groq integration
1. Set up the virtual environment:

   ```
   # Create virtual environment
   python -m venv venv

   # Activate the virtual environment
   venv\Scripts\activate        # On Windows
   # source venv/bin/activate   # On macOS/Linux

   # Install dependencies
   pip install -r requirements.txt
   ```
2. Create a `.env` file with your Groq API key:

   ```
   GROQ_API_KEY=your_api_key_here
   ```
3. Start the MCP servers (each on a different port):

   ```
   # On Windows
   .\start_servers.bat

   # Or start them individually
   python server_SGL.py
   python server_CALC.py 8002
   python server_WEATHER.py 8003
   ```
4. Run the multi-server client:

   ```
   python multi_server_client.py
   ```
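The tool names below combine a server prefix and a tool name with a triple underscore (e.g. `keyvalue___set`). A minimal sketch of how a client could build and split such namespaced names when merging tools from several servers; the helper names and data shapes are illustrative, not taken from `multi_server_client.py`:

```python
def namespace_tools(server_prefix, tools):
    """Prefix each tool name with its server's namespace.

    `tools` is a list of tool-descriptor dicts such as a /mcp/tools endpoint
    might return; the triple underscore keeps the prefix easy to split later.
    """
    return [
        {**tool, "name": f"{server_prefix}___{tool['name']}"}
        for tool in tools
    ]

def split_namespaced(name):
    """Recover (server_prefix, tool_name) from a namespaced tool name."""
    prefix, _, tool = name.partition("___")
    return prefix, tool

# Example: merging tools from two servers into one flat list
calc_tools = namespace_tools("calc", [{"name": "add"}, {"name": "sqrt"}])
kv_tools = namespace_tools("keyvalue", [{"name": "set"}])
all_tools = calc_tools + kv_tools
```

Namespacing like this lets the client hand one flat tool list to the LLM, then route each call back to the right server by splitting the name.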
- `keyvalue___set`: Store a value with a key
- `keyvalue___get`: Retrieve a value by key
- `keyvalue___list`: List all stored keys
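The store behind these three tools can be as simple as an in-memory dict. A sketch of that idea (the actual `server_SGL.py` implementation isn't shown here, so the class and response shapes are illustrative):

```python
class KeyValueStore:
    """In-memory backing store for the set/get/list tools."""

    def __init__(self):
        self._data = {}

    def set(self, key, value):
        # Overwrites silently if the key already exists
        self._data[key] = value
        return {"status": "ok", "key": key}

    def get(self, key):
        if key not in self._data:
            return {"error": f"key not found: {key}"}
        return {"key": key, "value": self._data[key]}

    def list(self):
        # Sorted for stable, predictable output
        return {"keys": sorted(self._data)}
```

Note that an in-memory store like this loses its contents when the server restarts, which is fine for a demo.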
- `calc___add`: Add two numbers
- `calc___subtract`: Subtract the second number from the first
- `calc___multiply`: Multiply two numbers
- `calc___divide`: Divide the first number by the second
- `calc___sqrt`: Calculate the square root of a number
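These five tools map naturally onto a small dispatch table. The sketch below also guards the two obvious error cases (division by zero, square root of a negative number); function and key names are illustrative, not from `server_CALC.py`:

```python
import math

# Binary operations share one dispatch table; sqrt is handled separately
OPS = {
    "add": lambda a, b: a + b,
    "subtract": lambda a, b: a - b,
    "multiply": lambda a, b: a * b,
    "divide": lambda a, b: a / b,
}

def calculate(op, a, b=None):
    """Run one calculator tool call, returning a result or error dict."""
    if op == "sqrt":
        if a < 0:
            return {"error": "cannot take the square root of a negative number"}
        return {"result": math.sqrt(a)}
    if op == "divide" and b == 0:
        return {"error": "division by zero"}
    return {"result": OPS[op](a, b)}
```

Returning an error dict instead of raising keeps the failure in-band, so the server can report it back to the LLM as a normal tool result.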
- `weather___current`: Get the current weather for a city (simulated)
- `weather___forecast`: Get the weather forecast for a city (simulated)
- `weather___cities`: List available cities
Each MCP server implements two main endpoints:
- `/mcp/tools` - Returns a list of available tools
- `/mcp/invoke` - Executes a tool with the provided parameters
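The core of those two endpoints can be sketched as plain handler functions over a tool registry; the real servers presumably wrap logic like this in an HTTP framework, and the registry layout and response shapes here are assumptions, not the project's actual schema:

```python
# Tool registry: name -> (description, callable taking a params dict)
TOOLS = {
    "add": ("Add two numbers", lambda params: params["a"] + params["b"]),
    "sqrt": ("Square root of a number", lambda params: params["a"] ** 0.5),
}

def handle_tools():
    """GET /mcp/tools - describe the available tools."""
    return {"tools": [{"name": name, "description": desc}
                      for name, (desc, _) in TOOLS.items()]}

def handle_invoke(name, params):
    """POST /mcp/invoke - run one tool with the given parameters."""
    if name not in TOOLS:
        return {"error": f"unknown tool: {name}"}
    _, func = TOOLS[name]
    return {"result": func(params)}
```

Keeping the registry as data means `/mcp/tools` and `/mcp/invoke` can never disagree about which tools exist.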
This project demonstrates integration with Groq's LLM API, but the MCP protocol can be used with any LLM that supports function calling, including:
- OpenAI (GPT models)
- Anthropic Claude
- Amazon Bedrock models
- Local models via tools like Ollama
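To use MCP tools with a function-calling LLM, the client has to translate each tool description into the provider's schema. A sketch for an OpenAI-style `tools` array (the MCP descriptor fields used here are assumptions, and other providers expect slightly different shapes):

```python
def to_function_schema(mcp_tool):
    """Convert an MCP-style tool descriptor into an OpenAI-style tool entry."""
    return {
        "type": "function",
        "function": {
            "name": mcp_tool["name"],
            "description": mcp_tool.get("description", ""),
            # Fall back to an empty JSON Schema object if none is declared
            "parameters": mcp_tool.get(
                "parameters", {"type": "object", "properties": {}}
            ),
        },
    }

# Hypothetical descriptor for the namespaced calc___add tool
tool = {
    "name": "calc___add",
    "description": "Add two numbers",
    "parameters": {
        "type": "object",
        "properties": {"a": {"type": "number"}, "b": {"type": "number"}},
        "required": ["a", "b"],
    },
}
schema = to_function_schema(tool)
```

Because the translation is a pure dict transformation, swapping providers mostly means swapping this one adapter function.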
This project is provided as an educational example of the MCP protocol.