- Prerequisites
- Set Up the Raspberry Pi
- Set Up Docker Compose
- Set Up Ollama WebUI: Step-by-Step Guide
- Download a Model
- Extra Resources
- Access the Ollama API
- Raspberry Pi 5, 8GB RAM
- 32GB SD Card
You can also follow along with this YouTube video instead.
- Connect the SD card to your laptop
- Download the Raspberry Pi Imager: https://www.raspberrypi.com/software/
- Run it, and you should see:
- "Choose Device" - choose Raspberry Pi 5
- "Choose OS" - choose the latest Raspberry Pi OS (64-bit is recommended)
- "Choose Storage" - choose the inserted SD card
- Now click Next. It will ask if you want to edit the settings; click "Edit Settings"
- Configure:
- Set the hostname to raspberrypi.local
- Set a username and password you will remember; we will use them shortly
- Enable "Configure wireless LAN" and add your Wi-Fi name and password
- Click Save and continue. It will take a few minutes to write everything to the SD card
- Insert the SD card into your Raspberry Pi and connect it to power
- SSH into the Raspberry Pi:
ssh <YOUR_USERNAME>@raspberrypi.local
- Install Docker:
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
- Add your user to the docker group so you can run Docker without sudo:
sudo usermod -aG docker ${USER}
- Check the Docker installation (logging in again applies the group change):
sudo su - ${USER}
docker version
docker run hello-world
- Install Docker Compose:
sudo apt-get install libffi-dev libssl-dev
sudo apt install python3-dev
sudo apt-get install -y python3 python3-pip
sudo pip3 install docker-compose
- Download the latest snapshot of ollama-webui:
git clone https://github.com/ollama-webui/ollama-webui webui
- Create a docker-compose.yml file (change the pi user in the volume path if your username differs):
```yaml
version: "3.9"
services:
  ollama:
    container_name: ollama
    image: ollama/ollama:latest
    restart: always
    volumes:
      - /home/pi/ollama:/root/.ollama

  ollama-webui:
    build:
      context: ./webui/
      args:
        OLLAMA_API_BASE_URL: '/ollama/api'
      dockerfile: Dockerfile
    image: ghcr.io/ollama/ollama-webui:main
    container_name: ollama-webui
    volumes:
      - ollama-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - ${OLLAMA_WEBUI_PORT-3000}:8080
    environment:
      - 'OLLAMA_API_BASE_URL=http://ollama:11434/api'
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped

volumes:
  ollama-webui: {}
  ollama: {}
```
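The `${OLLAMA_WEBUI_PORT-3000}` in the ports mapping uses shell-style default expansion: Compose substitutes the value of `OLLAMA_WEBUI_PORT` if it is set, and falls back to 3000 otherwise. A quick sketch of the same expansion in the shell:

```shell
# ${VAR-default} expands to $VAR if it is set, otherwise to the default.
unset OLLAMA_WEBUI_PORT
echo "${OLLAMA_WEBUI_PORT-3000}"    # variable unset, so this prints 3000

OLLAMA_WEBUI_PORT=8081
echo "${OLLAMA_WEBUI_PORT-3000}"    # variable set, so this prints 8081
```

So running, for example, `OLLAMA_WEBUI_PORT=8081 docker-compose up -d` would publish the WebUI on port 8081 instead of 3000.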
- Bring the containers up:
docker-compose up -d
- Access the WebUI at http://raspberrypi.local:3000 (or http://localhost:3000 from the Pi itself)
- Create a free account on first login
- Download the model you want to use (see below) by clicking the little cog icon and selecting Models
- For a list of available models, see the model library.
That is it!
- Access the web UI and log in with the username you already created
- Pull a model from Ollama.com, e.g. tinyllama or mistral:7b
- Test it by asking the WebUI: "Who are you?"
That is it!
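Note that the compose file above does not publish the Ollama port to the host, so the raw Ollama API is only reachable inside the compose network (the WebUI proxies it at `/ollama/api`). If you add `ports: ["11434:11434"]` to the `ollama` service, you can query the API directly. A minimal sketch, assuming the default hostname raspberrypi.local and the tinyllama model pulled earlier:

```shell
# Ask Ollama's /api/generate endpoint for a one-shot (non-streaming) completion.
# Assumes port 11434 has been published on the ollama service and that the
# tinyllama model has already been pulled.
curl http://raspberrypi.local:11434/api/generate \
  -d '{"model": "tinyllama", "prompt": "Who are you?", "stream": false}'
```

The reply is a JSON object whose `response` field holds the model's answer.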