[FEATURE] Allow local LLM backbones #15
Comments
Hi Julian, thanks for your interest in h2o-llmstudio! It is possible to specify a local folder in the LLM Backbone text field (below I use the local cache folder of
Regarding your comment about private Hugging Face repositories: you should be able to access private repositories once Hugging Face Hub access has been configured locally.
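As a minimal sketch of what "specify a local folder" can look like in practice: the helper below builds the default Hugging Face Hub cache path for a repo id, which could then be pasted into the LLM Backbone text field. The `models--<org>--<name>` cache layout is an assumption based on recent `huggingface_hub` versions, and `org/model-name` is a placeholder repo id, not a real model.

```python
from pathlib import Path

def local_backbone_path(repo_id: str,
                        cache_dir: str = "~/.cache/huggingface/hub") -> Path:
    """Return the default local hub cache folder for a repo id.

    Assumes the "models--<org>--<name>" layout used by recent
    huggingface_hub releases; the snapshot must already exist locally.
    """
    folder = "models--" + repo_id.replace("/", "--")
    return Path(cache_dir).expanduser() / folder

# Hypothetical repo id, for illustration only:
print(local_backbone_path("org/model-name"))
```

The resulting folder (or, more precisely, a concrete `snapshots/<hash>` subfolder inside it) is what would be entered into the text field.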
For continuing to train from a previous experiment, there are two additional ways besides the one elaborated by @maxjeblick: A: Use the GUI and
Awesome - and sorry for not evaluating that properly. I am working with your tool for the first time, and it is exactly what I was looking for, apart from the fact that I cannot use one of the provided backbones. Thanks for answering!
Thanks @JulianGerhard21 - please continue asking questions if anything is unclear.
🚀 Feature
Currently there is a fixed list of LLM backbones hosted on and accessible via Hugging Face. It would be nice to also be able to specify a local path to a pre-trained model in the UI dropdown. Since "only" CausalLanguageModeling is currently supported, loading or training such a pre-trained model is of course subject to certain conditions that would have to be checked accordingly.
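To make the "conditions that would have to be checked" concrete, here is a hedged sketch of one such check: verifying that a local folder looks like a causal-LM checkpoint before offering it in the dropdown. The `architectures` key is standard in transformers `config.json` files, but the exact validation h2o-llmstudio would perform is an assumption, and `is_causal_lm_checkpoint` is a hypothetical helper name.

```python
import json
from pathlib import Path

def is_causal_lm_checkpoint(folder: str) -> bool:
    """Heuristically check whether a local folder holds a causal-LM checkpoint.

    Reads the transformers-style config.json and looks for an architecture
    name ending in "ForCausalLM" (e.g. "GPTNeoXForCausalLM").
    """
    config_file = Path(folder) / "config.json"
    if not config_file.is_file():
        return False
    config = json.loads(config_file.read_text())
    architectures = config.get("architectures") or []
    return any(arch.endswith("ForCausalLM") for arch in architectures)
```

A real implementation would likely also check for weight files and tokenizer assets, but the config check already filters out most non-matching models.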
Motivation
The scenarios and motivations for this feature are many. On the one hand, I create models with the tool myself that I may want to fine-tune further. On the other hand, there are models that I try out locally and want to work with. This would also be a transitional solution for models located in private Hugging Face repositories.
What do you think?