A Model Context Protocol (MCP) server built with Python that serves agenda data for an event.
This project is also a reusable template for building MCP servers.
- Learn to build MCP servers with Python.
- Create a template project that can be reused for other MCP servers.
- Provide a step-by-step workshop for the attendees.
- Build a demo MCP server that answers questions about the event agenda.
```
├── data/
│   ├── sample_database.db       # SQLite database with sample agenda data
│   └── sample_talks.json        # Sample agenda data in JSON
├── documentation/
│   ├── tools.md                 # What tools are and how they work
│   ├── workshop.md              # Step-by-step workshop instructions (English)
│   └── workshop_es.md           # Step-by-step workshop instructions (Spanish)
├── src/
│   ├── tools_client.py          # LLM client with tool calling
│   ├── config.py                # Configuration (Ollama URL, model, etc.)
│   ├── tools.py                 # Tool functions, schemas, and execution
│   ├── MCP_server.py            # MCP server exposing the agenda tools
│   └── telegram_bot.py          # Telegram bot frontend
├── tests/
│   ├── test_tools_client.py     # Client tests
│   └── test_tools.py            # Tool function tests
├── .env.sample                  # Environment variables template
└── .mcp.json                    # MCP server config for Claude Code
```
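The tool functions in `src/tools.py` are ordinary Python functions over this data. As an illustration, here is a minimal sketch of such a function, assuming the sample database has a `talks` table; the function name and schema are assumptions, not the project's actual code:

```python
import sqlite3

def get_talks(db_path: str = "data/sample_database.db") -> list[dict]:
    """Return every talk in the agenda database as a list of dicts.

    Hypothetical sketch: the real src/tools.py and the table name
    "talks" may differ.
    """
    con = sqlite3.connect(db_path)
    con.row_factory = sqlite3.Row  # rows become dict-like
    try:
        rows = con.execute("SELECT * FROM talks").fetchall()
    finally:
        con.close()
    return [dict(row) for row in rows]
```

Returning plain dicts keeps the result easy to serialize for the LLM, whether the function is called directly or exposed through the MCP server.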
Create a virtual environment and install the dependencies:

```sh
python3 -m venv .venv
source .venv/bin/activate
pip install --upgrade pip
pip install pytest python-dotenv langchain_core langchain_ollama "mcp[cli]" python-telegram-bot
```

Copy `.env.sample` to `.env`:

```sh
cp .env.sample .env
```

You have two options for the LLM backend:
Create an account at ollama.com and get an API key.
Fill in your `.env`:

```
OLLAMA_URL=https://ollama.com
OLLAMA_API_KEY=<your API key>
```
Note: Other providers (OpenAI, Anthropic, etc.) require small changes in `src/tools_client.py`. Check the LangChain documentation for chat model integrations.
Install Ollama and pull a model:

```sh
sudo apt install curl
curl -fsSL https://ollama.com/install.sh | sh
ollama pull gemma3
```

Fill in your `.env`:

```
OLLAMA_URL=http://localhost:11434
OLLAMA_API_KEY=
```
Note: You can use a smaller model instead of `gemma3`; browse the available models that support tool calling on ollama.com. To use a different model, pull it with `ollama pull <model>` and update the model name in `src/config.py`.
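For reference, `src/config.py` might read those variables roughly like this. The variable defaults are assumptions; the real file presumably loads `.env` via python-dotenv first, while plain `os.getenv` keeps this sketch dependency-free:

```python
import os

# Sketch of src/config.py: pull settings from the environment.
OLLAMA_URL = os.getenv("OLLAMA_URL", "http://localhost:11434")
OLLAMA_API_KEY = os.getenv("OLLAMA_API_KEY", "")
MODEL = os.getenv("MODEL", "gemma3")  # change after `ollama pull <model>`
```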
The MCP server exposes the agenda tools over the Model Context Protocol so any MCP-compatible client can use them.
Run it from the project root:
```sh
source .venv/bin/activate
python -m src.MCP_server
```

To try the tools directly, run the LLM client with tool calling:

```sh
source .venv/bin/activate
python -m src.tools_client
```

Create a bot with @BotFather on Telegram and copy the token it gives you into your `.env`:

```
TELEGRAM_TOKEN=<your bot token>
```
Run the bot (and the MCP server) from the project root:
```sh
source .venv/bin/activate
python -m src.MCP_server & python -m src.telegram_bot
```

This starts the MCP server in the background and then runs the Telegram bot in the foreground.
Then open your bot in Telegram and send /start.
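For reference, the `.mcp.json` entry that points Claude Code at this server might look like the following; the server name is an assumption, and the command mirrors the run line above:

```json
{
  "mcpServers": {
    "agenda": {
      "command": "python",
      "args": ["-m", "src.MCP_server"]
    }
  }
}
```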