A complete demonstration of the Model Context Protocol (MCP) showcasing how to securely bridge Large Language Models with real-world APIs.
When integrating LLMs with internal APIs, you face three critical problems:
- The Language Barrier: LLMs speak natural language; APIs speak HTTP+JSON
- The Security Dilemma: how to give LLMs API access without exposing credentials
- The Orchestration Burden: where does the business logic live?
This demo solves all three elegantly using MCP.
```
┌─────────────┐      ┌─────────────┐      ┌─────────────┐      ┌─────────────┐
│    User     │─────▶│     LLM     │─────▶│  MCP Server │─────▶│   Task API  │
│  (Spanish)  │◀─────│ (Reasoning) │◀─────│   (Bridge)  │◀─────│  (FastAPI)  │
└─────────────┘      └─────────────┘      └─────────────┘      └─────────────┘
                            │                    │                    │
                     Natural Language     API Key (Secure)      Stores in S3
                     Understanding        Translation Layer     JSON Files
```
- Task API (FastAPI): RESTful API for managing users, calls, and tasks
- MCP Server (FastMCP): Secure bridge exposing API as LLM tools with modern Python decorators
- Demo Client (Streamlit): Interactive web app showcasing the integration
- S3 Storage: Simple, persistent data layer (LocalStack for local dev)
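The round trip in the diagram can be sketched as a toy loop. Every name below is illustrative, not the demo's actual code: the LLM turns a natural-language request into a tool call, and the MCP server turns that tool call into an API request.

```python
# Toy sketch of the diagram's round trip (illustrative names throughout):
# natural language -> tool call (LLM) -> API request (MCP server) -> data.

def fake_llm(message: str) -> tuple[str, dict]:
    """Stand-in for the LLM: maps a request to a tool name + arguments."""
    # A real model infers this from the message; here it is hardcoded.
    return "register_user", {"name": "María García", "email": "maria@test-azollon.com"}

def fake_task_api(method: str, path: str, body: dict) -> dict:
    """Stand-in for the Task API: 'stores' the user and returns it with an id."""
    return {"id": "user-1", **body}

def mcp_register_user(name: str, email: str) -> dict:
    """MCP-server side: translates the tool call into an API request."""
    return fake_task_api("POST", "/users", {"name": name, "email": email})

TOOLS = {"register_user": mcp_register_user}

tool_name, args = fake_llm("Registra a María García como nueva usuaria")
result = TOOLS[tool_name](**args)
print(result["id"], result["name"])  # user-1 María García
```

The point of the split is that the LLM only ever sees tool names and results; the HTTP details (and, in the real demo, the credentials) stay on the MCP-server side.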
- ✅ FastMCP Framework: Decorator-based MCP server with automatic schema generation
- ✅ Secure Credential Isolation: API keys never exposed to LLM
- ✅ Natural Language Interface: Spanish/English commands → API calls
- ✅ Multi-step Orchestration: Complex workflows handled intelligently
- ✅ Complete CRUD Operations: Users, scheduled calls, and tasks
- ✅ Production-Ready: FastAPI + AWS App Runner + S3
- ✅ Interactive Demo: Streamlit web app with visual scenarios
- ✅ MCP Inspector: Debug and test tools interactively
- ✅ Docker Support: Full containerized deployment
- Python 3.12+
- Docker & Docker Compose (for containerized setup)
- GitHub API Key (for LLM features)
```bash
# Clone repository
git clone https://github.com/ariosramirez/mcp-pycon.git
cd mcp-pycon

# Set GitHub API key for LLM features
export GITHUB_API_KEY=your-github-token-here

# Start all services
docker compose up -d

# View logs
docker compose logs -f
```

Services available at:

- Task API: http://localhost:8000 (docs)
- MCP Server: http://localhost:8001
- Streamlit Demo: http://localhost:8501
- LocalStack S3: http://localhost:4566
```bash
# Install dependencies
uv sync
# OR
pip install -e .

# Configure environment
cp .env.example .env
# Edit .env with your credentials:
# - TASK_API_KEY
# - GITHUB_API_KEY
# - AWS configuration (for LocalStack: AWS_ENDPOINT_URL=http://localhost:4566)
```

Start services in separate terminals:
```bash
# Terminal 1: Start Task API (with LocalStack)
cd task_api
docker compose up -d
# OR run directly
python -m task_api.main

# Terminal 2: Start MCP Server
python -m mcp_server.server

# Terminal 3: Start Streamlit Demo
streamlit run demo_client/streamlit_app.py
```

Test and debug MCP tools interactively:
```bash
# Make sure Task API is running
npx @modelcontextprotocol/inspector fastmcp run mcp_server/server.py:mcp
```

Open http://localhost:5173 to:
- View all available tools with schemas
- Test tool calls with custom parameters
- Inspect request/response payloads
- Debug in real-time
Example: Test the register_user tool

```json
{
  "name": "Test User",
  "email": "test@example.com",
  "company": "Test Corp"
}
```

The Streamlit demo showcases three scenarios demonstrating MCP's capabilities:
"Por favor, registra a nuestro nuevo cliente 'Azollon International' con el contacto María García (maria@test-azollon.com) y agéndale una llamada de onboarding para este viernes a las 10am." ("Please register our new client 'Azollon International' with contact María García (maria@test-azollon.com) and schedule an onboarding call for this Friday at 10am.")
Demonstrates: Multi-step orchestration, secure API calls
"Muéstrame todas las llamadas pendientes y marca la primera como completada." ("Show me all pending calls and mark the first one as completed.")
Demonstrates: Data retrieval, intelligent processing, updates
"Crea una tarea de seguimiento para todos los clientes que tienen llamadas programadas esta semana." ("Create a follow-up task for every client with a call scheduled this week.")
Demonstrates: Complex reasoning, data aggregation, orchestration
FastMCP reduces boilerplate by 60% compared to traditional MCP SDK:
Traditional MCP SDK:

```python
@app.list_tools()
async def list_tools() -> list[Tool]:
    return [Tool(name="...", inputSchema={...})]  # Manual JSON schema

@app.call_tool()
async def call_tool(name: str, arguments: Any):
    if name == "register_user":  # Manual routing
        return [TextContent(type="text", text="...")]
```

FastMCP:
```python
@mcp.tool
async def register_user(
    name: Annotated[str, "Full name of the user"],
    email: Annotated[str, "Email address"],
) -> str:
    """Register a new user."""
    return "✅ User registered!"  # Auto-wrapped!
```

Benefits:

- Automatic schema generation from type hints
- Built-in parameter validation (Pydantic)
- Type-safe with modern Python features
- Cleaner error handling with ToolError
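The "automatic schema generation from type hints" point can be illustrated with plain stdlib introspection. The sketch below is a simplified stand-in for what FastMCP derives from a signature, not FastMCP's actual output:

```python
import inspect
from typing import Annotated, get_args, get_origin, get_type_hints

def schema_from_signature(fn):
    """Build a minimal JSON-schema-like dict from a function's type hints."""
    hints = get_type_hints(fn, include_extras=True)  # keep Annotated metadata
    sig = inspect.signature(fn)
    props, required = {}, []
    type_map = {str: "string", int: "integer", bool: "boolean"}
    for name, param in sig.parameters.items():
        hint = hints[name]
        desc = None
        if get_origin(hint) is Annotated:
            # Annotated[T, "description"] -> unwrap T and pull the description
            hint, *meta = get_args(hint)
            desc = next((m for m in meta if isinstance(m, str)), None)
        entry = {"type": type_map.get(hint, "string")}
        if desc:
            entry["description"] = desc
        props[name] = entry
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {"type": "object", "properties": props, "required": required}

async def register_user(
    name: Annotated[str, "Full name of the user"],
    email: Annotated[str, "Email address"],
) -> str:
    """Register a new user."""

schema = schema_from_signature(register_user)
print(schema["properties"]["name"])  # {'type': 'string', 'description': 'Full name of the user'}
print(schema["required"])            # ['name', 'email']
```

FastMCP does the same kind of unwrapping (plus Pydantic validation) so the LLM receives a schema without anyone writing JSON by hand.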
Users:
- `POST /users` - Register user
- `GET /users` - List all users
- `GET /users/{user_id}` - Get user details

Calls:
- `POST /calls` - Schedule call
- `GET /calls` - List calls (filterable by user_id, status_filter)
- `PATCH /calls/{call_id}/status` - Update status

Tasks:
- `POST /tasks` - Create task
- `GET /tasks` - List tasks (filterable by user_id, status_filter)
- `PATCH /tasks/{task_id}/status` - Update status
All endpoints require the `X-API-Key` header (except `/health`):

```bash
curl -X POST http://localhost:8000/users \
  -H "X-API-Key: demo-secret-key-change-in-production" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "María García",
    "email": "maria@test-azollon.com",
    "company": "Azollon International"
  }'
```

Default key: demo-secret-key-change-in-production

Health check (no key required):

```bash
curl http://localhost:8000/health
```

From Python:

```python
import httpx

client = httpx.Client(
    base_url="http://localhost:8000",
    headers={"X-API-Key": "demo-secret-key-change-in-production"}
)

# Create user
response = client.post("/users", json={
    "name": "María García",
    "email": "maria@test-azollon.com",
    "company": "Azollon International"
})
user = response.json()

# Schedule call
response = client.post("/calls", json={
    "user_id": user["id"],
    "title": "Onboarding Call",
    "scheduled_for": "2025-10-20T10:00:00Z",
    "duration_minutes": 30
})
```

- API Key Isolation: Keys live ONLY in the MCP Server environment, never exposed to the LLM
- AWS Secrets Manager: Use for production credential management
- HTTPS/TLS: Always enable in production
- Key Rotation: Implement regular rotation policy
- Access Logging: Monitor all API calls and tool uses
- Least Privilege: Grant minimal S3 permissions
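The isolation principle in the first bullet can be made concrete with a small sketch (illustrative names, not the demo's actual code): the key is read once from the MCP server's own environment and used only to build outbound request headers; tool results returned to the LLM never contain it.

```python
import os

def build_api_headers() -> dict:
    """Headers for outbound Task API calls; the key exists only server-side."""
    key = os.environ["TASK_API_KEY"]  # present only in the MCP server's env
    return {"X-API-Key": key, "Content-Type": "application/json"}

def tool_result(payload: dict) -> dict:
    """What the LLM sees: data only -- no headers, no credential."""
    assert "X-API-Key" not in payload
    return payload

os.environ["TASK_API_KEY"] = "demo-secret-key-change-in-production"
headers = build_api_headers()
result = tool_result({"id": "u1", "name": "María García"})
print(result["name"])  # María García
```

In production the same shape holds; only the source of the key changes (AWS Secrets Manager instead of a plain environment variable).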
```
mcp-pycon-demo/
├── task_api/                  # FastAPI application
│   ├── main.py                # API endpoints
│   ├── models.py              # Pydantic models
│   ├── storage.py             # S3 storage layer
│   ├── Dockerfile             # Container image
│   └── README.md              # API documentation
├── mcp_server/                # MCP Server
│   ├── server.py              # FastMCP implementation
│   └── README.md              # MCP server documentation
├── demo_client/               # Streamlit Demo
│   ├── streamlit_app.py       # Web UI
│   ├── langgraph_agent.py     # LLM agent with LangGraph
│   └── azure_chat_wrapper.py  # GitHub Models integration
├── docker-compose.yml         # Container orchestration
├── pyproject.toml             # Project metadata
└── README.md                  # This file
```
In task_api/main.py:

```python
@app.post("/your-endpoint")
async def your_endpoint(
    data: YourModel,
    api_key: str = Header(..., alias="X-API-Key")
):
    verify_api_key(api_key)
    return {"result": "success"}
```

Then expose it as an MCP tool in mcp_server/server.py:

```python
from typing import Annotated, Literal

from fastmcp.exceptions import ToolError
from pydantic import Field

@mcp.tool
async def your_new_tool(
    param: Annotated[str, "Parameter description"],
    count: Annotated[int, Field(ge=1, le=100)] = 10,
    status: Annotated[Literal["active", "inactive"] | None, "Filter"] = None
) -> str:
    """Tool description for LLM."""
    client = get_http_client()
    try:
        response = await client.post("/your-endpoint", json={"param": param})
        response.raise_for_status()
        return f"✅ Success: {response.json()}"
    except httpx.HTTPStatusError as e:
        raise ToolError(f"Failed: {e.response.json().get('message')}") from e
```

FastMCP handles schema generation, validation, and error formatting automatically!
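The verify_api_key helper referenced in the FastAPI snippet isn't shown here. A minimal sketch, assuming the expected key comes from a TASK_API_KEY environment variable; in the real app you would raise FastAPI's `HTTPException(status_code=401)` rather than the stdlib `PermissionError` used below:

```python
import hmac
import os

def verify_api_key(api_key: str) -> None:
    """Reject requests whose X-API-Key doesn't match the configured key."""
    expected = os.environ.get("TASK_API_KEY", "")
    # Constant-time comparison avoids leaking the key via timing differences.
    if not expected or not hmac.compare_digest(api_key, expected):
        # In the FastAPI app: raise HTTPException(401, "Invalid API key")
        raise PermissionError("Invalid API key")

os.environ["TASK_API_KEY"] = "demo-secret-key-change-in-production"
verify_api_key("demo-secret-key-change-in-production")  # passes silently
try:
    verify_api_key("wrong-key")
except PermissionError as e:
    print(e)  # Invalid API key
```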
```bash
# Start all services
docker compose up -d

# View logs (all)
docker compose logs -f

# View logs (specific service)
docker compose logs -f task-api

# Rebuild and restart
docker compose up -d --build

# Stop all services
docker compose down

# Stop and remove volumes
docker compose down -v
```

This architecture is ideal for:
- Internal Tool Integration: Connect LLMs to company APIs securely
- Multi-Service Orchestration: Coordinate multiple microservices
- Agent Architectures: Build autonomous AI agents
- Enterprise AI: Production-grade LLM applications
- API Democratization: Natural language access to APIs
- FastMCP Documentation: https://gofastmcp.com
- MCP Protocol: https://modelcontextprotocol.io
- MCP Inspector: https://github.com/modelcontextprotocol/inspector
- FastAPI: https://fastapi.tiangolo.com
- LangGraph: https://langchain-ai.github.io/langgraph/
- GitHub Models: https://github.com/marketplace/models
Want to extend the demo? Try the PyCon Challenge:
CHALLENGE.md - Step-by-step guide to add:
- Client Summary Tool (data aggregation)
- User Info Prompt (template generation)
- Complete with code, testing instructions, and troubleshooting
Time: 30-40 minutes | Difficulty: Intermediate
- Task API README - API endpoints, testing, Docker setup
- MCP Server README - MCP tools, FastMCP patterns, Inspector usage
- CLAUDE.md - Development guidelines for Claude Code
- CHALLENGE.md - Hands-on challenge for workshop attendees
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
- Built for PyCon presentation
- Powered by FastMCP framework
- Anthropic's Model Context Protocol
- FastAPI for the web framework
- AWS for serverless infrastructure
- GitHub Models for LLM access
¡Construyamos el futuro de la IA juntos! (Let's build the future of AI together!)