mcp-agent-orchestrator is a production-ready starter template for building local-first Agentic AI systems. It demonstrates how to connect a FastMCP (Python) Server to a LangGraph Orchestrator using Ollama as the local LLM host.
- Standardized MCP Integration: Uses the Model Context Protocol to decouple tool logic from LLM logic.
- Stateful Orchestration: Built with LangGraph for robust, multi-turn agent conversations and error handling.
- Privacy-First: 100% local execution using Ollama (Llama 3.1/Mistral)—no API keys required.
- Modern UI: A clean Streamlit interface for real-time interaction with your MCP tools.
- FastMCP Framework: Simplified Python tool definition with automatic schema generation.
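The "automatic schema generation" point means FastMCP reads a tool function's type hints and docstring to build its MCP schema for you. A minimal plain-Python sketch of that idea (using only the standard library, not the actual FastMCP internals) looks like this:

```python
import inspect
from typing import get_type_hints

def tool_schema(fn):
    """Derive a minimal JSON-Schema-like dict from a function's signature,
    illustrating what FastMCP's tool decorator automates for you."""
    hints = get_type_hints(fn)
    type_map = {int: "integer", float: "number", str: "string", bool: "boolean"}
    properties = {
        name: {"type": type_map.get(hints.get(name), "object")}
        for name in inspect.signature(fn).parameters
    }
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "inputSchema": {
            "type": "object",
            "properties": properties,
            "required": list(properties),
        },
    }

def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

schema = tool_schema(add)
print(schema["name"])                                  # add
print(schema["inputSchema"]["properties"]["a"]["type"])  # integer
```

With FastMCP itself, you get the same effect by decorating `add` with the framework's tool decorator; the schema above is what gets advertised to the MCP client.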
- MCP Host (Ollama): Provides the reasoning engine (LLM).
- MCP Server (FastMCP): Defines Python tools (e.g., SQLite, File System, Web Search).
- MCP Client: Manages the `stdio` transport and subprocess communication.
- Orchestrator (LangGraph): A stateful ReAct agent that decides when to call MCP tools.
- Interface (Streamlit): The user-facing web application.
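The orchestrator's ReAct pattern boils down to a loop: the LLM either asks for a tool call or produces a final answer. The sketch below shows that control flow in plain Python with stand-in components (`fake_llm` and the `tools` dict are hypothetical; in the real system the LLM is Ollama via LangGraph, and each tool call goes to the FastMCP server over `stdio`):

```python
# A plain-Python sketch of the ReAct loop the LangGraph orchestrator implements.
def run_agent(question, llm, tools, max_steps=5):
    history = [("user", question)]
    for _ in range(max_steps):
        # The LLM returns either ("call", tool_name, args) or ("final", text).
        action = llm(history)
        if action[0] == "final":
            return action[1]
        _, name, args = action
        result = tools[name](**args)  # in the real system: an MCP tool call over stdio
        history.append(("tool", name, result))
    return "step limit reached"

# Stand-in components for demonstration only:
def fake_llm(history):
    if history[-1][0] == "user":
        return ("call", "add", {"a": 2, "b": 3})
    return ("final", f"The answer is {history[-1][2]}")

print(run_agent("What is 2 + 3?", fake_llm, {"add": lambda a, b: a + b}))
# prints: The answer is 5
```

LangGraph replaces this hand-rolled loop with a state graph, which is what gives the project its multi-turn state handling and error recovery.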
- Install Ollama and run `ollama run llama3.1`.
- Python 3.10 or higher.
- Clone the repository:

  ```shell
  git clone https://github.com/NxtGenCodeBase/mcp-agent-orchestrator.git
  cd mcp-agent-orchestrator
  ```