
Update request: ollama 0.1.31 → 0.1.32 #304823

Closed · 1 task done
peperunas opened this issue Apr 17, 2024 · 11 comments

@peperunas (Contributor) commented Apr 17, 2024

  • Package name: ollama
  • Latest released version: 0.1.32
  • Current version on the unstable channel: 0.1.31
  • Current version on the stable/release channel: 0.1.28

Notify maintainers

@abysssol

Note for maintainers: Please tag this issue in your PR.

Add a 👍 reaction to issues you find important.

@peperunas added the `9.needs: package (update)` label Apr 17, 2024
@abysssol (Contributor) commented:
I tried updating ollama, but it appears to have a problem detecting rocm libraries (similar to #297081), so it'll probably take a while for me to figure it out. I don't want to update ollama if I know it'll break the rocm build.

@nonetrix (Contributor) commented Apr 29, 2024

> I tried updating ollama, but it appears to have a problem detecting rocm libraries (similar to #297081), so it'll probably take a while for me to figure it out. I don't want to update ollama if I know it'll break the rocm build.

Was it perhaps this? That's fixed now and merged in NixOS unstable. ollama 0.1.32 would be great for WizardLM 2 8x22B support.

@abysssol (Contributor) commented:
Unfortunately, that was a different problem that broke ollama in an entirely separate way (#306654) (interestingly enough, both problems only affected rocm).

Apologies for being unable to keep ollama up to date, but it seems this might be beyond my abilities. I haven't entirely given up yet, but I'm not spending much time on this anymore.

Do you think it would be better to keep ollama up to date and just leave rocm broken?

Also, I updated my ollama flake to 0.1.32, if anyone wants to use that for now. https://github.com/abysssol/ollama-flake
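
For anyone who wants to try it, a minimal sketch of pulling the flake into a NixOS configuration might look like the following. The input name and the `packages.x86_64-linux.default` attribute path are assumptions here, so check the ollama-flake README for the actual outputs.

```nix
{
  inputs = {
    nixpkgs.url = "github:nixos/nixpkgs/nixos-unstable";
    # Hypothetical input name; points at the flake linked above.
    ollama.url = "github:abysssol/ollama-flake";
  };

  outputs = { nixpkgs, ollama, ... }: {
    nixosConfigurations.myhost = nixpkgs.lib.nixosSystem {
      system = "x86_64-linux";
      modules = [
        {
          # Assumed attribute path; the flake may expose rocm/cuda
          # variants under other names, see its README.
          environment.systemPackages =
            [ ollama.packages.x86_64-linux.default ];
        }
      ];
    };
  };
}
```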

@nonetrix (Contributor) commented Apr 29, 2024

No problem, I think I'll just use that flake for now. Better not to break things, though breakage wouldn't be unheard of in unstable, I guess; I'm not really an expert in the project. I am using an AMD GPU myself, so leaving rocm broken would be somewhat annoying, but I could live without it: most of the time I'm only offloading small parts of the model to VRAM due to "only" having 16 GB of it.

@nonetrix mentioned this issue May 5, 2024
@jonas-w's comment was marked as off-topic.

@abysssol's comment was marked as off-topic.

@jonas-w's comment was marked as off-topic.
@volfyd (Contributor) commented May 23, 2024

Can this be closed now that #312608 has been merged?
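
For anyone landing here later: with that PR merged, a minimal sketch for enabling the updated package on NixOS unstable might look like this. The `services.ollama.acceleration` option exists in recent nixpkgs modules, but check the option docs for your channel before relying on it.

```nix
# configuration.nix (sketch): run ollama as a service with ROCm.
{ ... }:
{
  services.ollama = {
    enable = true;
    # "rocm" or "cuda"; omit (null) for CPU-only inference.
    acceleration = "rocm";
  };
}
```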

@peperunas (Contributor, Author) commented:
Thanks for your great work @abysssol! 🙏

@abysssol (Contributor) commented:
@volfyd was a huge help. I doubt I would have gotten rocm working without him.
Thank you @volfyd!

@mastoca commented May 25, 2024

@abysssol & @volfyd thank you!
