We work on Databricks in isolation (no internet access), and are unable to create references to our base/foundation/source models on the Hugging Face Hub. We have language models in our model registry that we would like to reference using the model URI notation `models:/llm_foundation/1` when we build our adapters (PEFT - QLoRA / LoRA).

I have followed the tutorial at https://mlflow.org/docs/latest/llms/transformers/tutorials/fine-tuning/transformers-peft.html, but it does not indicate that we can build these adapters on top of language models that already exist in our model registry. I would like to reference a `source_model` that sits in our model registry. At the moment we would load our foundation model / tokenizer as components, but these get downloaded from the registry and stored in a `local0` (ephemeral) location within the Databricks cluster, and any reference to this model used for training points to that ephemeral location instead of our model registry.

TL;DR:
Is it possible to make `source_model_name` (the repository name of the base model) reference a model URI such as `models:/llm_foundation/1` when registering a PEFT adapter, instead of a Hugging Face Hub repository, e.g. `source_model_name: mistralai/Mistral-7B-v0.1`?