Support for LLaMa and Alpaca #16
In case you're interested, it looks like llama.cpp already has a flake.nix at its root: https://github.com/ggerganov/llama.cpp/
This space moves quick. llama.cpp's nix flake works well, but does not provide an overlay, which is not necessarily a problem.
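Without an overlay, a consumer can still pull the package straight from the flake's outputs. A minimal sketch of such a consumer flake (the output attribute path `packages.x86_64-linux.default` is an assumption, not confirmed against llama.cpp's actual flake):

```nix
# Hypothetical consumer flake: uses llama.cpp's package output directly,
# no overlay required. Attribute names are assumed for illustration.
{
  inputs.llama-cpp.url = "github:ggerganov/llama.cpp";

  outputs = { self, llama-cpp, ... }: {
    # Re-export the upstream package as this flake's default package.
    packages.x86_64-linux.default =
      llama-cpp.packages.x86_64-linux.default;
  };
}
```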
I was also thinking we could make nix functions which yield derivations that compute the prompt. Imagine.
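One way to picture that idea: a Nix function that takes a prompt string and yields a derivation which runs the model at build time, so the completion is cached in the Nix store. This is a hypothetical sketch — the `llama-cpp` package, binary name, and flags are all assumptions:

```nix
# Hypothetical: prompt -> derivation whose $out is the model's completion.
# The llama-cpp attribute, model path, binary name, and CLI flags are
# assumptions for illustration only.
{ pkgs, llama-cpp, model }:
prompt:
pkgs.runCommand "llm-completion" { } ''
  ${llama-cpp}/bin/main \
    -m ${model} \
    -p ${pkgs.lib.escapeShellArg prompt} \
    > $out
''
```

Because the prompt is part of the derivation's inputs, identical prompts would be computed once and then served from the store or a binary cache.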
Resolved via https://github.com/nixified-ai/flake/releases/tag/2. text-generation-webui is now available for Nvidia users.
This project is awesome and I'm so glad it exists.
However, I'm wondering about supporting dalai or text-generation-webui for running LLaMA and Alpaca. These models are much more suitable for running on home computers, and I believe they would make an excellent addition to nixified.ai :)
I'd be open to helping as well, though my Nix-fu is not quite as powerful as yours.