
Support for LLaMa and Alpaca #16

Closed
Chickensoupwithrice opened this issue Apr 2, 2023 · 4 comments
Comments

Chickensoupwithrice commented Apr 2, 2023

This project is awesome and I'm so glad it exists.

However, I'm wondering about supporting dalai or text-generation-webui for running LLaMA and Alpaca. These models are much more suitable for running on home computers, and I believe they would make an excellent addition to nixified.ai :)

I'd be open to helping as well, though my Nix-fu is not quite as powerful as yours.

@Chickensoupwithrice Chickensoupwithrice changed the title Support for other "AI Stacks" Support for other Dalai Apr 2, 2023
@Chickensoupwithrice Chickensoupwithrice changed the title Support for other Dalai Support for other Dalai, LLaMa and Alpaca Apr 2, 2023
@Chickensoupwithrice Chickensoupwithrice changed the title Support for other Dalai, LLaMa and Alpaca Support for Dalai, LLaMa and Alpaca Apr 2, 2023
@Chickensoupwithrice Chickensoupwithrice changed the title Support for Dalai, LLaMa and Alpaca Support for LLaMa and Alpaca Apr 2, 2023
MatthewCroughan (Member) commented

In case you're interested, it looks like llama.cpp already has a flake.nix at its root https://github.com/ggerganov/llama.cpp/
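
Since llama.cpp ships its own flake.nix, a downstream flake could consume it as an input. A minimal sketch, assuming llama.cpp exposes a `packages.<system>.default` output (the exact attribute names may differ between llama.cpp revisions):

```nix
{
  inputs.llama-cpp.url = "github:ggerganov/llama.cpp";

  outputs = { self, llama-cpp, ... }: {
    # Re-expose llama.cpp's default package under this flake.
    packages.x86_64-linux.llama = llama-cpp.packages.x86_64-linux.default;
  };
}
```

With something like this in place, `nix build .#llama` would build llama.cpp pinned via this flake's lock file.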

Chickensoupwithrice (Author) commented

This space moves quickly. llama.cpp's Nix flake works well, but it does not provide an overlay, which is not necessarily a problem.
However, there are new kids on the block now too: Dolly and OpenAssistant. Though both of these are largely focused on providing open datasets, building these projects with Nix is still a useful undertaking, IMO.

MatthewCroughan (Member) commented

I was also thinking we could make Nix functions that yield derivations which compute the prompt. Imagine

generateText "prompt", which would run llama.cpp in the sandbox. Relaxed sandbox mode would allow access to the GPU.
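
The idea above could be sketched roughly as follows. This is a hypothetical illustration, not an implemented API: `llama-cpp` is assumed to provide a `main` binary, `model` is an assumed path to model weights, and a fixed `--seed` is used so the derivation's output has a chance of being reproducible:

```nix
{ pkgs, llama-cpp, model }:

let
  # Hypothetical: build a derivation whose output is the generated text.
  generateText = prompt:
    pkgs.runCommand "generated-text" { } ''
      ${llama-cpp}/bin/main \
        --model ${model} \
        --seed 42 \
        --prompt ${pkgs.lib.escapeShellArg prompt} > $out
    '';
in
  generateText "Once upon a time"
```

Note that even with a fixed seed, bit-identical output across machines is not guaranteed (floating-point differences, threading), so in practice this would likely need to be an impure or relaxed-sandbox build, as mentioned above for GPU access.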

max-privatevoid (Member) commented

Resolved via https://github.com/nixified-ai/flake/releases/tag/2. text-generation-webui is now available for Nvidia users.
