Actually, give me one moment; I am going to try installing one more thing to see if it fixes it.

Edit: Scratch that. I realized there is an amdgpu-firmware package in nonguix that I did not have installed before. I have installed it now, but I am still running into the same issue.
Sick, I was able to figure it out. I found this comment ROCm/Tensile#1936, which suggested setting the environment variable `HSA_OVERRIDE_GFX_VERSION=10.3.0`. That worked for me. The full command I ran to get it working is: `docker run -d -e HSA_OVERRIDE_GFX_VERSION=10.3.0 --device /dev/kfd --device /dev/dri -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama:0.1.27-rocm`
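In case it helps anyone managing this with Docker Compose instead of `docker run`: the same setup can be sketched as a compose file (untested; image tag, device paths, volume name, and port are taken from the command above):

```yaml
services:
  ollama:
    image: ollama/ollama:0.1.27-rocm
    environment:
      # The workaround from ROCm/Tensile#1936
      - HSA_OVERRIDE_GFX_VERSION=10.3.0
    devices:
      # ROCm device nodes passed through to the container
      - /dev/kfd
      - /dev/dri
    volumes:
      - ollama:/root/.ollama
    ports:
      - "11434:11434"

volumes:
  ollama:
```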
I am running GNU Guix with the following ROCm packages installed:

My upstream Guix commit is 8c0282c. Here is the output of `rocminfo`:
Here I am running `rocm-smi` inside a Docker container:

So I am fairly sure I have ROCm set up correctly with Docker. When I run the following, I get a crash:
I have tried with several different models, `gemma:2b` being the smallest, to see if size was the issue. This is the log from the container when I crash:

Edit: and FWIW, this error occurred both when I tried to run the model from the `ollama` CLI like I demonstrated, and when I tried to connect via `open-webui`, so it's not an issue specific to using the CLI.
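Since the crash is not specific to the CLI, it can also be reproduced directly against Ollama's HTTP API. A minimal Python sketch (request shape follows Ollama's documented `/api/generate` route; the address assumes the default `11434` port mapping from the `docker run` command above):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port

def build_payload(model: str, prompt: str) -> bytes:
    # Non-streaming request body for Ollama's /api/generate endpoint.
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(model: str, prompt: str) -> str:
    # Sends the prompt to a locally running Ollama server and returns the reply.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running server): generate("gemma:2b", "Why is the sky blue?")
```

If the server hits the same ROCm fault, this request should fail the same way the CLI does.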