This repository provides a simple setup for running Ollama and Open WebUI behind Traefik as a reverse proxy, orchestrated with Docker Compose. The required models are downloaded automatically, so everything comes up with a single command.
This example is meant to run on a small VPS (Virtual Private Server), showing that you can run lightweight models on a dual-core, 8 GB RAM system.
I highly recommend using Hostinger, which offers excellent and affordable plans. Check them out with this link: Hostinger Plans
If you find this project useful, please consider giving it a star on GitHub! ⭐ Your support helps keep this project maintained and encourages further development.
Ensure you have Git installed; Docker itself is handled by the installation script below. Then:

- Clone this repository:

  ```bash
  git clone https://github.com/erickwendel/ollama-webui-traefik-docker.git
  cd ollama-webui-traefik-docker
  ```

- Set up your domain by modifying the `.env` file:

  ```bash
  DOMAIN=srv665452.hstgr.cloud # Change this to your actual domain
  ```

- Run the installation script (to install Docker if not already installed):

  ```bash
  ./install-docker.sh
  ```

- Start the services:

  ```bash
  docker-compose up -d
  ```
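Once the stack is up, it's worth confirming that every container started cleanly. The commands below are standard Docker Compose commands; the exact service names come from this repository's `docker-compose.yml`, so treat this as a quick sanity check rather than part of the setup:

```bash
# List the services and their current state
docker-compose ps

# Follow the logs while the models download
# (this can take a while on a small VPS)
docker-compose logs -f
```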
### Traefik

- Handles HTTPS with Let's Encrypt.
- Routes traffic to Ollama and Open WebUI.
- Configured via `traefik.yml`.
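Once DNS points at the server and the stack is running, you can confirm that Let's Encrypt issued a certificate. `curl -v` prints the TLS handshake details, including the certificate issuer; the domain below is the example value from `.env`:

```bash
# The issuer line should mention Let's Encrypt
curl -vI https://srv665452.hstgr.cloud 2>&1 | grep -i "issuer"
```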
### Ollama

- Hosts the AI models.
- Auto-downloads the predefined models (`gemma:2b`, `deepseek-r1:1.5b`, `qwen2.5-coder:1.5b`, `codegemma:2b`).
- Accessible via `https://your-domain/ollama`.
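If you need a model beyond the predefined ones, you can pull it manually inside the running container. This sketch assumes the Compose service is named `ollama` (check `docker-compose.yml` for the actual name), and `llama3.2:1b` is just an example tag:

```bash
# Pull an additional lightweight model into the Ollama container
docker-compose exec ollama ollama pull llama3.2:1b
```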
### Model Downloader

- A helper service that ensures the models are downloaded inside the Ollama container before usage.
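The actual implementation lives in this repository, but an init step like this usually boils down to calling Ollama's `/api/pull` endpoint once per model. Here is a minimal sketch, assuming the Ollama container is reachable at the Compose service name `ollama` on its default port `11434`:

```bash
#!/bin/sh
# Hypothetical sketch of the model-download step,
# not the repository's actual script.
for model in gemma:2b deepseek-r1:1.5b qwen2.5-coder:1.5b codegemma:2b; do
  echo "Pulling $model..."
  curl -s http://ollama:11434/api/pull -d "{\"model\": \"$model\"}"
done
```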
### Open WebUI

- Provides a web-based interface for interacting with the AI models.
- Accessible via `https://your-domain/`.
The `request-ollama.sh` Bash script lets you fetch the available model tags and send prompts to Ollama. Usage:

```bash
./request-ollama.sh "Your prompt here"
```
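For reference, `GET /api/tags` (list local models) and `POST /api/generate` (run a prompt) are the two Ollama API routes a script like this relies on. The sketch below is hypothetical; the base URL, model choice, and structure are assumptions, not the script's actual contents:

```bash
#!/bin/bash
# Hypothetical sketch of request-ollama.sh, assuming Ollama is
# exposed through Traefik under the /ollama path.
OLLAMA_URL="https://srv665452.hstgr.cloud/ollama"

# List the model tags available on the server
curl -s "$OLLAMA_URL/api/tags"

# Send the prompt passed as the first argument and get a
# single, non-streamed response
curl -s "$OLLAMA_URL/api/generate" \
  -d "{\"model\": \"gemma:2b\", \"prompt\": \"$1\", \"stream\": false}"
```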
A deploy script uploads all the files in the directory to the VPS via SCP.
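If you prefer to do this by hand, a plain `scp` achieves the same result; the user and target path below are placeholders:

```bash
# Copy the project files to the VPS (replace user and path)
scp -r ./* root@srv665452.hstgr.cloud:~/ollama-webui-traefik-docker
```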
This repository is intended for example purposes only and is not recommended for production use. For production deployments, consider using Kubernetes, Docker Swarm, or other orchestration tools to ensure high availability and security.
Modify the `.env` file to set your domain:

```bash
DOMAIN=srv665452.hstgr.cloud
```
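To confirm that Compose actually picks the variable up, you can render the final configuration. `docker-compose config` prints the Compose file with `.env` values substituted:

```bash
# Print the resolved configuration; your domain should appear
# wherever ${DOMAIN} is referenced
docker-compose config
```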
- This setup automatically downloads AI models inside the Ollama container.
- Make sure to configure your DNS settings to point your domain to your server's IP.
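You can verify the DNS record before starting the stack; `dig` (or `nslookup`) should return your VPS's public IP:

```bash
# The A record should resolve to your server's IP address
dig +short srv665452.hstgr.cloud
```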
This project is licensed under the MIT License.