A Docker-based local coding agent project that runs an Ollama model container and a Python agent container. The agent connects to Ollama over an internal bridge network and provides an interactive terminal experience.
Note: The agent is just a chatbot as of now (agentic capabilities are still in development).
This project builds a secure local coding assistant using:
- `ollama/ollama:latest` as the LLM serving backend
- a custom Python-based `agent` container for interaction
- Docker Compose for orchestration and isolation
The agent is designed to run inside Docker and connect to Ollama over an isolated internal network.
- `ollama` service
  - Runs the Ollama server
  - Serves the model configured via `OLLAMA_MODEL=<model>` in the `.env` file
  - Uses `/scripts/start_ollama.sh` to configure `OLLAMA_HOST` and launch the server
  - Exposes port `11434` internally only
- `agent` service
  - Builds from `agent/Dockerfile`
  - Contains the Python app in `agent/app/main.py`
  - Loads environment variables from `.env`
  - Uses `docker compose exec agent python3 main.py` for interactive sessions
- `local_code_network`
  - Bridge network (`driver: bridge`) connecting the two services
- `ollama_data` volume
  - Persists downloaded Ollama models
  - Keeps model files across restarts
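Pieced together from the list above, the compose wiring looks roughly like this (a sketch for orientation, not the repository's exact file; the entrypoint and volume mount path in particular are assumptions):

```yaml
services:
  ollama:
    image: ollama/ollama:latest
    env_file: .env                            # provides OLLAMA_MODEL / OLLAMA_HOST
    entrypoint: ["/scripts/start_ollama.sh"]  # assumed: script copied or mounted at this path
    expose:
      - "11434"                               # internal-only, no host port
    volumes:
      - ollama_data:/root/.ollama             # default model directory in the ollama image
    networks:
      - local_code_network

  agent:
    build: ./agent                            # agent/Dockerfile
    env_file: .env
    networks:
      - local_code_network

networks:
  local_code_network:
    driver: bridge

volumes:
  ollama_data:
```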
This project uses several Docker hardening measures:
- `expose: 11434` for Ollama
  - does not publish a host port
  - only makes the port available to containers on the internal network
- `read_only: true` on the agent container
  - prevents write operations to the container filesystem
- `cap_drop: - ALL`
  - removes Linux capabilities from the agent container
- `no-new-privileges: true`
  - blocks privilege escalation
- Non-root agent user
  - the agent container runs as `agent`
  - reduces risk from container compromise

Note: If you want host-only access to Ollama for testing, use `127.0.0.1:11434:11434` instead of `expose`.
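These measures correspond to compose keys along the following lines (again a sketch; consult the repository's `docker-compose.yaml` for the authoritative values):

```yaml
services:
  ollama:
    expose:
      - "11434"                  # reachable only from containers on the bridge network
    # host-only access for testing: publish instead of expose
    # ports:
    #   - "127.0.0.1:11434:11434"

  agent:
    read_only: true              # no writes to the container filesystem
    tmpfs:
      - /tmp                     # writable scratch space for runtime data
    cap_drop:
      - ALL                      # strip every Linux capability
    security_opt:
      - no-new-privileges:true   # block privilege escalation
    user: agent                  # non-root user created in agent/Dockerfile
```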
The project is orchestrated with `docker-compose.yaml`.

Key services:

- `ollama`
- `agent`

Key files:

- `docker-compose.yaml`
- `agent/Dockerfile`
- `agent/app/main.py`
- `agent/app/requirements.txt`
- `scripts/start_ollama.sh`
- `.env`
- Docker Engine
- Docker Compose (v2 or greater)
- Git (optional)
- Sufficient disk space for Ollama model files
- Docker
- Docker Compose
- Debian-based Python container
- FastAPI / SQLAlchemy dependencies in the agent environment
- `httpx` for Ollama API requests
- `python-dotenv` for environment variable loading
```
git clone https://github.com/himmat12/local-coding-agent.git local_coding_agent
cd local_coding_agent
```

Create or update the `.env` file with:

```
OLLAMA_HOST=http://ollama:11434
OLLAMA_MODEL=<model>   # e.g. qwen2.5-coder:7b
AGENT_WORKSPACE=/workspace
```

```
docker compose build
docker compose up -d
```

This will start the Ollama and agent containers.

```
docker compose ps
```

You should see both `ollama` and `local-code-agent` running.
The agent is designed for interactive terminal sessions.
```
docker compose exec agent python3 main.py
```

- The agent starts
- It prints startup info
- It prints a list of available commands
- You can type questions or use utility commands
Within the interactive agent terminal, use these commands:
- `help`, `--help`, `?` - Display available commands
- `history`, `show history` - Print the chat history with timestamps
- `clear` - Clear the current chat session history
- `status` - Display session status and model info
- `exit`, `quit` - Exit the agent cleanly
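One plausible shape for that command handling, sketched in Python (illustrative only; the repo's `main.py` may structure it differently, and `OLLAMA_MODEL` here is simply read from the environment as described above):

```python
import os

OLLAMA_MODEL = os.getenv("OLLAMA_MODEL", "<model>")

def handle_command(line: str, history: list[dict]) -> bool:
    """Handle a built-in command; return False if the input should go to the model."""
    cmd = line.strip().lower()
    if cmd in ("help", "--help", "?"):
        print("commands: help, history, clear, status, exit")
    elif cmd in ("history", "show history"):
        for entry in history:
            print(f"[{entry['time']}] {entry['role']}: {entry['text']}")
    elif cmd == "clear":
        history.clear()
    elif cmd == "status":
        print(f"model: {OLLAMA_MODEL}, messages in session: {len(history)}")
    elif cmd in ("exit", "quit"):
        raise SystemExit(0)
    else:
        return False  # not a built-in command: treat it as a question for the model
    return True
```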
- `docker-compose.yaml`
  - Defines the Ollama and agent services
  - Configures the internal Docker network
- `agent/Dockerfile`
  - Builds the agent container from Debian
  - Installs Python and required tools
  - Installs Python dependencies from `agent/app/requirements.txt`
- `scripts/start_ollama.sh`
  - Starts Ollama with `OLLAMA_HOST=0.0.0.0`
  - Ensures Ollama listens on the internal Docker network
- `agent/app/main.py`
  - Contains the interactive Python agent logic
  - Uses HTTP requests to Ollama to generate responses (sketched below)
  - Includes loading animation and built-in command support
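As a rough illustration of the request `main.py` makes (assuming the standard Ollama `/api/generate` endpoint with non-streaming responses; the helper name is hypothetical):

```python
import os

import httpx
from dotenv import load_dotenv

load_dotenv()  # pull OLLAMA_HOST / OLLAMA_MODEL from .env

OLLAMA_HOST = os.getenv("OLLAMA_HOST", "http://ollama:11434")
OLLAMA_MODEL = os.getenv("OLLAMA_MODEL", "qwen2.5-coder:7b")

def generate(prompt: str) -> str:
    """Send one prompt to Ollama and return the complete response text."""
    resp = httpx.post(
        f"{OLLAMA_HOST}/api/generate",
        json={"model": OLLAMA_MODEL, "prompt": prompt, "stream": False},
        timeout=120.0,  # generation can take a while on CPU-only hosts
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(generate("Reverse a string in Python, one line."))
```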
- The agent is intentionally not started automatically in the container CMD because it is interactive.
- Use `docker compose exec agent python3 main.py` when you want a live session.
- The `agent` container is read-only and uses `tmpfs` for temporary runtime data.
To see service logs:

```
docker compose logs -f ollama   # ollama service logs
docker compose logs -f agent    # agent service logs
```

If the agent is unresponsive, make sure you launched it via:

```
docker compose exec agent python3 main.py
```

and confirm the internal network and service health:

```
docker compose ps
```

If the agent exits unexpectedly, this usually means the interactive terminal session was killed. Run it again with `docker compose exec agent python3 main.py`.
- Add a dedicated CLI wrapper script inside the agent container
- Add model download validation to `scripts/start_ollama.sh`
- Add logs and runtime metrics for the agent
- Add a documented `docker-compose.override.yml` for development