Update request: ollama 0.1.31 → 0.1.32 #304823
I tried updating ollama, but it appears to have a problem detecting rocm libraries (similar to #297081), so it'll probably take a while for me to figure it out. I don't want to update ollama if I know it'll break the rocm build.
Was it perhaps this? That's fixed now and merged in NixOS unstable; ollama 0.1.32 would be great for WizardLM 2 8x22B support.
Unfortunately, that was a different problem that broke ollama in an entirely separate way (#306654) (interestingly enough, both problems only affected rocm). Apologies for being unable to keep ollama up to date, but it seems this might be beyond my abilities. I haven't entirely given up yet, but I'm not spending much time on this anymore. Do you think it would be better to keep ollama up to date and just leave rocm broken? Also, I updated my ollama flake to 0.1.32, if anyone wants to use that for now. https://github.com/abysssol/ollama-flake
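For anyone who wants to try the flake in the meantime, a minimal sketch of pulling it into a flake-based NixOS configuration could look like the following. The input name and output attribute path here are assumptions; check the flake's own `flake.nix` for the actual outputs.

```nix
{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
    # Assumed input name for the 0.1.32 flake linked above.
    ollama-flake.url = "github:abysssol/ollama-flake";
  };

  outputs = { self, nixpkgs, ollama-flake, ... }: {
    nixosConfigurations.myhost = nixpkgs.lib.nixosSystem {
      system = "x86_64-linux";
      modules = [
        ({ ... }: {
          # Assumed output attribute; the flake may expose variant
          # packages (e.g. a rocm-specific one) under other names.
          environment.systemPackages = [
            ollama-flake.packages.x86_64-linux.default
          ];
        })
      ];
    };
  };
}
```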
No problem, I think I will just use that flake instead for now. Better not to break things, though breakage wouldn't be unheard of in unstable, I guess; I'm not really an expert in the project. I am personally using an AMD GPU, so a broken rocm build would be somewhat annoying, but I could live without it: most of the time I'm only offloading small parts of the model into VRAM anyway, due to "only" having 16 GB of it.
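If the goal is to keep using the NixOS ollama module while pulling the newer package from the flake, a sketch along these lines might work. It assumes the flake is passed to the module via `specialArgs`, and that the module exposes `package` and `acceleration` options; treat the option names and the flake's attribute path as assumptions to verify against your nixpkgs revision.

```nix
{ ollama-flake, pkgs, ... }:

{
  services.ollama = {
    enable = true;
    # Assumes the module's acceleration option accepts "rocm";
    # check the nixpkgs ollama module for the exact option and values.
    acceleration = "rocm";
    # Use the flake's 0.1.32 build instead of the nixpkgs package
    # (the attribute path is an assumption).
    package = ollama-flake.packages.${pkgs.system}.default;
  };
}
```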
Can this be closed now that #312608 has been merged?
Thanks for your great work @abysssol! 🙏
Notify maintainers
@abysssol
Note for maintainers: Please tag this issue in your PR.
Add a 👍 reaction to issues you find important.