A template for building agent-based applications that interact with LLMs through Ollama and with tools/resources through MCP.
Agent Template provides a framework for running a UI that can interact with a locally hosted Large Language Model (LLM) through Ollama. The system lets the LLM make decisions and execute workflows based on user inputs and the tools available to it.
The template implements a Model Context Protocol (MCP) client architecture, designed to interact with a future "api" service that uses FastMCP to expose API function calls as MCP tools.
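As a rough illustration of the Ollama side of this loop, the sketch below builds a request against Ollama's local HTTP API using only the standard library. The endpoint is Ollama's default; the model name is an assumption (any locally pulled model works):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build a non-streaming generate request for a locally hosted model."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask(prompt: str) -> str:
    """Send the prompt to the local Ollama server and return the generated text."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.loads(resp.read())["response"]
```

In the template, calls like this sit behind the MCP client layer rather than being made directly from the UI.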
- UI interface for interacting with LLM agents
- Integration with Ollama for local LLM hosting
- MCP client implementation for structured communication
- Workflow execution framework
- Extensible tool system
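One way the "extensible tool system" could look in practice is a small registry that maps tool names to callables, similar in spirit to how FastMCP exposes functions as tools. This is a sketch, not the template's actual API; all names here are hypothetical:

```python
from typing import Any, Callable, Dict

TOOLS: Dict[str, Callable] = {}  # tool name -> callable registry

def tool(fn: Callable) -> Callable:
    """Decorator that registers a function as an agent tool under its own name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def add(a: int, b: int) -> int:
    """Example tool: add two numbers."""
    return a + b

def call_tool(name: str, **kwargs: Any) -> Any:
    """Dispatch a tool call requested by the LLM to the registered function."""
    if name not in TOOLS:
        raise KeyError(f"Unknown tool: {name}")
    return TOOLS[name](**kwargs)
```

New capabilities are added by defining a function and decorating it with `@tool`; the agent discovers them through the registry.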
The system consists of:
- UI Layer: User interface for agent interaction
- MCP Client: Handles communication protocol with LLMs
- Ollama Integration: Connects to locally-hosted LLM
- Workflow Engine: Executes sequences of actions
- Tool System: Expandable set of capabilities for the agent
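In its simplest form, the workflow engine component might run an ordered list of steps, threading each step's output into the next. A minimal sketch under assumed semantics, not the template's actual implementation:

```python
from typing import Any, Callable, List

class WorkflowEngine:
    """Run a sequence of steps, feeding each step's output to the next."""

    def __init__(self, steps: List[Callable[[Any], Any]]):
        self.steps = steps

    def run(self, initial: Any) -> Any:
        result = initial
        for step in self.steps:
            result = step(result)
        return result

# Example: a two-step workflow that normalizes then tokenizes user input.
engine = WorkflowEngine([str.strip, str.split])
```

In the real system, steps would typically be tool invocations or LLM calls rather than plain string functions.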
- Ollama installed and configured
- Docker and Docker Compose
- Node.js and npm for development
- Clone the repository:

  git clone https://github.com/yourusername/agent-template.git
  cd agent-template

- Start the application using Docker Compose:

  docker-compose up
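For orientation, a compose file for this setup might look roughly like the following; the service names, build context, and UI port are assumptions, and only the Ollama port is its documented default:

```yaml
# Hypothetical docker-compose.yml sketch; service names and ports are assumptions.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"   # Ollama's default API port
  ui:
    build: .
    ports:
      - "3000:3000"     # UI served locally (assumed port)
    depends_on:
      - ollama
```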
- API Service: A Python FastAPI server using FastMCP to wrap API function calls as MCP tools
- Enhanced workflow capabilities
- Additional tool integrations
- Improved agent reasoning capabilities
Contributions are welcome! Please feel free to submit a Pull Request.