duzaao/local_agent
# Local Enterprise AI Agent (MCP + Local LLM)

This repository contains an internal demonstration of how to deploy a fully local, enterprise-grade AI agent using:

- Local LLMs served via Ollama
- An internal MCP server exposing a restricted tool set
- FastAPI microservices (Auth + Flights)
- Docker Compose to orchestrate the entire stack

The implementation corresponds to the architecture described in the article “Local LLMs and MCP-Based Enterprise Agents: Feasibility, Challenges, and Preliminary Evaluation.”

This project is intended for internal use only.
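The key security idea here is that the MCP server exposes only a restricted set of tools to the agent. The repo's actual server (`mcp/server_new.py`) speaks the MCP protocol; the sketch below is a hypothetical, stdlib-only illustration of the restriction pattern — explicit registration against an allowlist, with everything else rejected before dispatch. All names (`RestrictedToolRegistry`, `search_flights`) are illustrative, not taken from the repository.

```python
# Hypothetical sketch of the "restricted tools" idea: tools must be on an
# allowlist to be registered, and only registered tools can be called.
from typing import Any, Callable, Dict


class RestrictedToolRegistry:
    def __init__(self, allowlist: set) -> None:
        self.allowlist = allowlist
        self._tools: Dict[str, Callable[..., Any]] = {}

    def register(self, name: str, fn: Callable[..., Any]) -> None:
        # Refuse anything outside the allowlist at registration time.
        if name not in self.allowlist:
            raise PermissionError(f"tool {name!r} is not on the allowlist")
        self._tools[name] = fn

    def call(self, name: str, **kwargs: Any) -> Any:
        # Unknown or disallowed tools never reach a handler.
        if name not in self._tools:
            raise KeyError(f"unknown or disallowed tool: {name!r}")
        return self._tools[name](**kwargs)


registry = RestrictedToolRegistry(allowlist={"search_flights"})
registry.register("search_flights", lambda origin, dest: [f"{origin}->{dest}"])
print(registry.call("search_flights", origin="GRU", dest="LIS"))  # -> ['GRU->LIS']
```

In the real stack the same gatekeeping happens inside the MCP server, so the LLM can only ever invoke the tools the operator chose to expose.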


## 📁 Project Structure

```
local_agent/
├── agent/
│   ├── agent5.py
│   ├── Dockerfile
│   ├── requirements.txt
│   └── start.sh
├── api/
│   ├── Dockerfile
│   ├── requirements.txt
│   ├── scripts/
│   │   └── run_services.py
│   └── src/
│       ├── auth/
│       ├── flights/
│       └── shared/
├── mcp/
│   ├── server_new.py
│   ├── mcp.json
│   ├── Dockerfile
│   └── requirements.txt
├── infra/
│   ├── docker-compose.yml
│   ├── startup.sh
│   ├── test_stack.py
│   └── test_stack_fixed.py
├── terraform/
│   ├── main.tf
│   ├── variables.tf
│   ├── provider.tf
│   └── outputs.tf
├── questions.jsonl
└── README.md
```

## 📝 Environment Variables (.env)

Create a `.env` file at the project root:

```
MONGODB_URI=<internal-mongodb-uri>
MONGODB_DB=authsvc

JWT_SECRET=<your-secret>
JWT_ISSUER=authsvc
JWT_AUDIENCE=api
ACCESS_TOKEN_TTL_SECONDS=900
REFRESH_TOKEN_TTL_SECONDS=2592000

CUSTOMER_SERVICE_TOKEN=<internal-token>

LLM_PROVIDER=ollama
MODEL=deepseek-r1:8b
```
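To show how the `JWT_*` variables fit together, here is a minimal sketch of minting an HS256 access token from them. The auth service presumably uses a proper JWT library; this stdlib-only version (and its fallback defaults) is illustrative only.

```python
# Illustrative only: mint an HS256 JWT from the .env settings above.
# The real auth service's implementation may differ.
import base64
import hashlib
import hmac
import json
import os
import time


def b64url(data: bytes) -> str:
    # JWT uses unpadded base64url encoding.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def mint_access_token(sub: str) -> str:
    secret = os.environ.get("JWT_SECRET", "dev-secret")
    ttl = int(os.environ.get("ACCESS_TOKEN_TTL_SECONDS", "900"))
    now = int(time.time())
    header = {"alg": "HS256", "typ": "JWT"}
    payload = {
        "sub": sub,
        "iss": os.environ.get("JWT_ISSUER", "authsvc"),
        "aud": os.environ.get("JWT_AUDIENCE", "api"),
        "iat": now,
        "exp": now + ttl,  # token expires after ACCESS_TOKEN_TTL_SECONDS
    }
    signing_input = (
        f"{b64url(json.dumps(header).encode())}."
        f"{b64url(json.dumps(payload).encode())}"
    )
    sig = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"


token = mint_access_token("user-123")
print(token.count("."))  # a JWT has three dot-separated parts -> prints 2
```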

## 🤖 Selecting the LLM

Inside `infra/docker-compose.yml`:

```yaml
environment:
  - LLM_PROVIDER=${LLM_PROVIDER}
  - MODEL=${MODEL}
```

Choose any model installed in Ollama:

```shell
ollama pull deepseek-r1:8b
ollama pull llama3.1:8b
ollama pull mistral:7b
```
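The agent reaches whichever model `MODEL` names through Ollama's local HTTP API. The sketch below only builds a request for Ollama's documented `/api/generate` endpoint without sending it, since the stack may not be running; the prompt and the `localhost:11434` default port are assumptions.

```python
# Build (but do not send) a request for Ollama's /api/generate endpoint,
# using the MODEL variable from .env. Illustrative; agent5.py may use an
# Ollama client library instead of raw HTTP.
import json
import os
import urllib.request


def build_generate_request(prompt: str) -> urllib.request.Request:
    payload = {
        "model": os.environ.get("MODEL", "deepseek-r1:8b"),
        "prompt": prompt,
        "stream": False,  # request a single JSON response, not a stream
    }
    return urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default port
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_generate_request("List flights from GRU to LIS.")
print(req.full_url)  # -> http://localhost:11434/api/generate
```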

## 🚀 How to Run

1. Prepare `.env`
2. Pull the LLM model into Ollama
3. Start the stack:

```shell
cd infra/
docker compose up --build
```

This brings up:

- Agent
- MCP server
- API microservices
- Ollama runtime
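The compose file wires those four services together roughly as below. This fragment is illustrative only — service names, ports, and build paths are assumptions; the repository's `infra/docker-compose.yml` is authoritative.

```yaml
# Illustrative shape of infra/docker-compose.yml (not the actual file).
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"      # Ollama's default API port
  api:
    build: ../api          # FastAPI Auth + Flights microservices
    env_file: ../.env
  mcp:
    build: ../mcp          # MCP server with the restricted tool set
    depends_on:
      - api
  agent:
    build: ../agent
    environment:
      - LLM_PROVIDER=${LLM_PROVIDER}
      - MODEL=${MODEL}
    depends_on:
      - mcp
      - ollama
```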

## 🧪 Batch Evaluation Mode

If `questions.jsonl` exists, run the agent in batch mode:

```shell
python agent5.py --input questions.jsonl --output results.jsonl
```

Or inside the container:

```shell
docker exec -it infra-agent-1 sh
python agent5.py --api
```
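The batch mode implied by `--input`/`--output` can be sketched as a read-answer-append loop over JSONL: one JSON object per input line, one result object per output line. The `answer()` stub and the `question`/`answer` field names below are assumptions, not the repo's actual schema.

```python
# Hypothetical sketch of a JSONL batch-evaluation loop like the one
# agent5.py's --input/--output flags suggest. answer() stands in for the
# real agent call.
import io
import json


def answer(question: str) -> str:
    return f"(stub answer for: {question})"  # placeholder for the real agent


def run_batch(infile, outfile) -> int:
    """Answer each JSONL question, write one result per line, return the count."""
    n = 0
    for line in infile:
        line = line.strip()
        if not line:
            continue  # tolerate blank lines
        record = json.loads(line)
        record["answer"] = answer(record["question"])
        outfile.write(json.dumps(record) + "\n")
        n += 1
    return n


src = io.StringIO('{"question": "Which flights go to LIS?"}\n')
dst = io.StringIO()
print(run_batch(src, dst))  # -> 1
```

Keeping one self-contained JSON object per line means a crashed run can be resumed by counting the lines already written.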

## 📚 Reference

This repository corresponds to the implementation used in the article on local LLMs + MCP for enterprise agents.
