Central chat API powered by the OpenAI Agents SDK with MCP (Model Context Protocol) support. Accepts authenticated requests, keeps per-session conversation history in Redis, and can connect to one or more MCP servers (e.g. finance-app) so the agent can use remote tools in conversation.
## Features

- Chat API: `POST /` with `session_id` and `message`; returns the agent's reply
- API key authentication via the `x-api-key` header
- Rate limiting: 20 requests per minute per client (SlowAPI)
- Session memory: Redis-backed conversation history (24h TTL) for multi-turn context
- MCP integration: optional comma-separated list of MCP server SSE URLs; the agent connects at first use and can call tools from those services
- Docker Compose with Redis; runs on a shared `web` network for Caddy and upstream MCP services
## Prerequisites

- Docker and Docker Compose
- An existing Docker network named `web` (e.g. `docker network create web`)
- An OpenAI API key (for the agent)
## Setup

1. Clone and enter the repo

   ```sh
   git clone https://github.com/youssefaltai/chat-agent.git
   cd chat-agent
   ```

2. Configure the environment

   ```sh
   cp .env.example .env
   ```

   Edit `.env` and set:

   - `OPENAI_API_KEY` — your OpenAI API key (required for the agent)
   - `API_KEY` — secret for the chat API (clients send this in the `x-api-key` header; change it from the default)
   - `REDIS_URL` — Redis connection URL (default: `redis://redis:6379/0` for Docker)
   - `MCP_SERVER_URLS` — optional; comma-separated MCP SSE URLs (e.g. `http://finance_app:3000/mcp/sse`)

3. Run with Docker

   ```sh
   docker compose up -d --build
   ```

   The service runs as the `siri_agent` container and depends on `redis` (container name `chat_redis`).

4. Verify

   - Health: `GET http://localhost:3000/health` → `{"status": "ok"}`
   - Chat: `POST http://localhost:3000/` with header `x-api-key: <API_KEY>` and body `{"session_id": "s1", "message": "Hello"}`
## Environment variables

| Variable | Description | Default |
|---|---|---|
| `OPENAI_API_KEY` | OpenAI API key for the agent | — (required) |
| `API_KEY` | Secret for `x-api-key` (chat API auth) | `change-me` |
| `REDIS_URL` | Redis URL for session history | `redis://redis:6379/0` |
| `MCP_SERVER_URLS` | Comma-separated MCP SSE URLs | — (optional) |
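`config.py` loads these variables via Pydantic settings; the parsing logic can be sketched with the standard library alone. This is an illustrative stand-in, not the repo's actual code — the field names mirror the table above, but the dataclass and `load_settings` helper are assumptions:

```python
import os
from dataclasses import dataclass, field


@dataclass
class Settings:
    """Illustrative stand-in for the Pydantic settings in config.py."""
    openai_api_key: str
    api_key: str = "change-me"
    redis_url: str = "redis://redis:6379/0"
    mcp_server_urls: list[str] = field(default_factory=list)


def load_settings(env=os.environ) -> Settings:
    urls = env.get("MCP_SERVER_URLS", "")
    return Settings(
        openai_api_key=env["OPENAI_API_KEY"],  # required, no default
        api_key=env.get("API_KEY", "change-me"),
        redis_url=env.get("REDIS_URL", "redis://redis:6379/0"),
        # optional comma-separated list -> stripped, non-empty URLs
        mcp_server_urls=[u.strip() for u in urls.split(",") if u.strip()],
    )
```

Note how an unset `MCP_SERVER_URLS` yields an empty list, so MCP stays fully optional.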
## API

- `GET /health` — health check; returns `{"status": "ok"}`.
- `POST /` — chat. Requires the `x-api-key` header and a JSON body:

  ```json
  { "session_id": "string", "message": "string" }
  ```

  Returns:

  ```json
  { "reply": "string" }
  ```

  Rate limit: 20 requests per minute per IP.
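From a client's perspective, the request above can be built with the standard library. This is a stdlib-only sketch (the base URL assumes the local Docker setup from the setup steps; the helper name is mine):

```python
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str,
                       session_id: str, message: str) -> urllib.request.Request:
    """Build the POST / chat request; send it with urllib.request.urlopen()."""
    body = json.dumps({"session_id": session_id, "message": message}).encode()
    return urllib.request.Request(
        base_url,
        data=body,
        headers={"x-api-key": api_key, "Content-Type": "application/json"},
        method="POST",
    )


# Sending it (requires the service to be running):
#   with urllib.request.urlopen(build_chat_request(
#           "http://localhost:3000/", "<API_KEY>", "s1", "Hello")) as resp:
#       reply = json.loads(resp.read())["reply"]
```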
## Project structure

```
app/
  api.py     # FastAPI app: health, chat endpoint, API key auth, rate limiting
  agent.py   # OpenAI Agents SDK agent; lazy init, MCP server connections
  chat.py    # Chat handler: load history, run agent, persist turn
  memory.py  # Redis-backed session history (get_history, append_message)
  config.py  # Pydantic settings from .env
```
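The role of `memory.py` can be sketched as two small helpers around a Redis client. The function names match the file comment above, but the per-message JSON shape, the key format, and the client-injection style are assumptions for illustration — the real module talks to the instance at `REDIS_URL`:

```python
import json

SESSION_TTL_SECONDS = 24 * 60 * 60  # 24h TTL, as in the feature list


def append_message(redis_client, session_id: str, role: str, content: str) -> None:
    """Push one turn onto the session's list and refresh its 24h expiry."""
    key = f"session:{session_id}"
    redis_client.rpush(key, json.dumps({"role": role, "content": content}))
    redis_client.expire(key, SESSION_TTL_SECONDS)


def get_history(redis_client, session_id: str) -> list[dict]:
    """Return the session's full conversation history, oldest turn first."""
    raw = redis_client.lrange(f"session:{session_id}", 0, -1)
    return [json.loads(item) for item in raw]
```

Refreshing the TTL on every append means a session only expires after 24 hours of inactivity, not 24 hours after it started.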
## Deployment notes

- Caddy: add a route, e.g. `handle_path /chat/* { reverse_proxy siri_agent:3000 }` (adjust the port if you change the app port).
- MCP services: run finance-app (or other MCP services) on the same `web` network and set `MCP_SERVER_URLS` so the agent can use their tools.
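For orientation, the Compose wiring implied above looks roughly like this. This is a sketch, not the repo's actual `docker-compose.yml` — the container names, port, and external `web` network come from the sections above, everything else is assumed:

```yaml
services:
  siri_agent:
    build: .
    env_file: .env
    ports:
      - "3000:3000"
    depends_on:
      - redis
    networks:
      - web

  redis:
    container_name: chat_redis
    image: redis:7
    networks:
      - web

networks:
  web:
    external: true   # created beforehand: docker network create web
```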
## License

MIT