Download, save and load LLMs from external drive #314

Open
yioannides opened this issue Sep 19, 2024 · 1 comment
Labels
enhancement New feature or request

Comments

@yioannides

Hello, thank you for this great application! I wanted to ask whether it would be possible to save all downloaded LLMs to an external or secondary internal drive instead of the default location.

LLMs are massive, so Alpaca could definitely benefit from a custom-location option in Preferences.
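Until a Preferences option exists, one possible workaround (a sketch, not an Alpaca feature): Ollama reads the `OLLAMA_MODELS` environment variable to decide where models are stored, so pointing it at an external drive before the server starts should relocate the downloads. The mount point below is an example.

```shell
# Example path for an external drive mount; substitute your own.
export OLLAMA_MODELS=/mnt/external/ollama/models

# The Ollama server must be (re)started after setting this so it
# picks up the new model directory, e.g.:
#   ollama serve
echo "models will be stored under: $OLLAMA_MODELS"
```

Whether Alpaca's bundled Ollama instance honors this variable in a Flatpak sandbox is an open question; filesystem permissions for the sandbox would also need to be granted.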

@yioannides yioannides added the enhancement New feature or request label Sep 19, 2024

CodingKoalaGeneral commented Sep 19, 2024

It would be nice to know whether this is compatible with the ollama CLI; pulling models from there if they already exist would be needed too.
I'm currently using the ollama API for VS Code support, since Alpaca's download of llama3.1 405b got stuck at 75% 🥇

INFO	[connection_handler.py | start] client version is 0.3.9
INFO	[connection_handler.py | request] GET : http://127.0.0.1:11435/api/tags
Gdk-Message: 20:27:50.308: Error 71 (Protocol error) dispatching to Wayland display.
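A minimal sketch of the "pull from there if existing" idea: the `/api/tags` endpoint (seen in the `GET : http://127.0.0.1:11435/api/tags` request in the log above) lists the models the local Ollama server already has, so Alpaca could check it before downloading. The function names and error handling here are illustrative, not Alpaca's actual code.

```python
import json
import urllib.request

# Port 11435 taken from the connection_handler.py log above.
OLLAMA_TAGS_URL = "http://127.0.0.1:11435/api/tags"

def parse_models(payload: dict) -> list[str]:
    """Extract model names from an /api/tags response body."""
    return [m["name"] for m in payload.get("models", [])]

def list_local_models(url: str = OLLAMA_TAGS_URL) -> list[str]:
    """Query the running Ollama server for already-pulled models."""
    with urllib.request.urlopen(url, timeout=5) as resp:
        return parse_models(json.load(resp))

if __name__ == "__main__":
    try:
        print(list_local_models())
    except OSError as err:
        print(f"Ollama not reachable: {err}")
```

With a list like this, a client could skip the pull entirely when the requested model name is already present locally.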
