
ollama-ui


Notes

You can spawn Ollama first and then download the desired LLM models via docker exec. Alternatively, spawn the whole stack directly and download the models from within Open WebUI in your browser.

# spawn ollama and ui
docker compose up -d

# (optional) download an llm model via docker exec
docker exec ollama ollama run llama3:8b

Afterwards, we can browse Open WebUI at http://127.0.0.1:8080 and register our first user account. You may want to disable open user registration later on by uncommenting the ENABLE_SIGNUP environment variable and restarting the Open WebUI container.
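
As a minimal sketch, assuming the Open WebUI service is named open-webui in the compose file, the relevant environment entry could look like this:

services:
  open-webui:
    environment:
      # Assumption: setting ENABLE_SIGNUP=false turns off open registration
      # once the first (admin) account has been created.
      - ENABLE_SIGNUP=false

After changing the variable, recreate the container (e.g. docker compose up -d) so the new environment takes effect.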

Tip

You likely want to pass a GPU through to the Ollama container; see the Docker and Ollama documentation on GPU support.
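
As a minimal sketch, assuming an NVIDIA GPU and the NVIDIA Container Toolkit installed on the host, the ollama service could reserve the GPU like this:

services:
  ollama:
    deploy:
      resources:
        reservations:
          devices:
            # Assumption: the NVIDIA Container Toolkit is installed on the host;
            # this reserves all available NVIDIA GPUs for the container.
            - driver: nvidia
              count: all
              capabilities: [gpu]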