How to Run TinyLlama Using Ollama-WebUI on Raspberry Pi 5

A step-by-step guide on how to run the TinyLlama (or Mistral 7B) LLM on a Raspberry Pi 5 using Docker + Ollama + WebUI.

Table of Contents:

  1. Prerequisites
  2. Set Up the Raspberry Pi
  3. Set Up Docker Compose
  4. Set Up Ollama WebUI: Step-by-Step Guide
  5. Download a Model
  6. Extra Resources
  7. Access the Ollama API

Prerequisites

  • A Raspberry Pi 5 with a power supply
  • A microSD card and a computer with an SD card reader
  • A Wi-Fi network for the Pi to join

Set Up the Raspberry Pi (Headless Setup)

You can also follow along with this YouTube video instead.

  1. Connect the SD card to your laptop
  2. Download and install the Raspberry Pi Imager: https://www.raspberrypi.com/software/
  3. Run it, and you should see: screenshot1.png
    • "Choose Device" - choose Raspberry Pi 5
    • "Choose OS" - choose the latest Raspberry Pi OS (64-bit is recommended)
    • "Choose Storage" - choose the inserted SD card
  4. Now click next, and it will ask you if you want to edit the settings; click "Edit settings" screenshot2.png
  5. Configure screenshot3.png
    • Enable the hostname option and set it to raspberrypi.local
    • Set a username and password you will remember; we will use them shortly
    • Enable "Configure Wireless LAN" and add your Wi-Fi name and password
    • Click save and continue. It will take a few minutes to write everything to the SD card
  6. Insert the SD card into your Raspberry Pi and connect it to power
  7. SSH into the Raspberry Pi:
ssh <YOUR_USERNAME>@raspberrypi.local
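
If raspberrypi.local does not resolve right away, give the Pi a minute or two to boot and join the Wi-Fi, then check that it is reachable (a quick sanity check, assuming your computer is on the same network):

# Confirm the Pi answers on the network before trying SSH
ping -c 3 raspberrypi.local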

Set Up Docker Compose

  1. Install Docker:
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
  2. Add your user to the docker group:
sudo usermod -aG docker ${USER}
  3. Check the Docker installation:
sudo su - ${USER}
docker version
docker run hello-world
  4. Install Docker Compose:
sudo apt-get install libffi-dev libssl-dev
sudo apt install python3-dev
sudo apt-get install -y python3 python3-pip
sudo pip3 install docker-compose
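
To confirm Docker Compose installed correctly, you can print its version (a quick sanity check; the exact version string will vary):

docker-compose --version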

Set Up Ollama WebUI: Step-by-Step Guide

  1. Download the latest snapshot of ollama-webui:
git clone https://github.com/ollama-webui/ollama-webui webui
  2. Create a docker-compose.yaml file (you can change the user pi):
  version: "3.9"
  services:
 ollama:
   container_name: ollama
   image: ollama/ollama:latest
   restart: always
   volumes:
     - /home/pi/ollama:/root/.ollama
 ollama-webui:
   build:
     context: ./webui/
     args:
       OLLAMA_API_BASE_URL: '/ollama/api'
     dockerfile: Dockerfile
   image: ghcr.io/ollama/ollama-webui:main
   container_name: ollama-webui
   volumes:
     - ollama-webui:/app/backend/data
   depends_on:
     - ollama
   ports:
     - ${OLLAMA_WEBUI_PORT-3000}:8080
   environment:
     - 'OLLAMA_API_BASE_URL=http://ollama:11434/api'
   extra_hosts:
     - host.docker.internal:host-gateway
   restart: unless-stopped
  volumes:
 ollama-webui: {}
 ollama: {}
 
  3. Bring the containers up:
docker-compose up -d
  4. Access the WebUI at http://raspberrypi.local:3000 (or http://localhost:3000 from the Pi itself)
  5. Create a free account on first login
  6. Download the model you want to use (see below) by clicking on the little cog icon and selecting a model (a CLI alternative is sketched after this list) Screenshot05.png
  7. For the full list of available models, see the Ollama model library.
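
If you prefer the terminal to the cog menu, the same models can be pulled through the Ollama CLI inside the running container (a minimal sketch, assuming the container is named ollama as in the compose file above):

# Pull models directly with the Ollama CLI inside the container
docker exec -it ollama ollama pull tinyllama
docker exec -it ollama ollama pull mistral:7b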

That is it!

Running the TinyLlama Model on Ollama WebUI

  1. Access the WebUI and log in with the account you already created

Screenshot06.png

  2. Pull a model from Ollama.com; select tinyllama or mistral:7b

Screenshot07.png

  3. Test it by asking the WebUI: "who are you?" (the same test can be run from the terminal; see the sketch below)

Screenshot08.png
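
The same test can also be run from the command line (a minimal sketch, assuming the ollama container from the compose file above and that tinyllama has already been pulled):

# One-shot prompt against TinyLlama from the command line
docker exec -it ollama ollama run tinyllama "who are you?"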

That is it!

Extra Resources:

Access the Ollama API
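
Ollama serves a REST API on port 11434 inside the container. A minimal sketch, assuming you add a ports mapping ("11434:11434") to the ollama service in the compose file above so the API is reachable from the host:

# Ask TinyLlama for a completion over the Ollama REST API
curl http://raspberrypi.local:11434/api/generate -d '{
  "model": "tinyllama",
  "prompt": "Why is the sky blue?",
  "stream": false
}'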
