feat(model): Allow from_pretrained to accept PeftConfig class #612
Conversation
Signed-off-by: Aaron <29749331+aarnphm@users.noreply.github.com>
I really like the idea, thanks a lot for working on this !
Would you mind adding a simple test for that? It can live either in https://github.com/huggingface/peft/blob/main/tests/test_config.py or in https://github.com/huggingface/peft/blob/main/tests/testing_common.py (in which case you would need to duplicate it in test_decoder_models.py and test_encoder_decoder_models.py).
Also I left a comment, maybe we should swap the order of positional arguments to avoid a breaking change. What do you think?
cc @pacman100 as well to make sure we're aligned on this
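The argument-order concern above can be sketched as follows. This is only an illustration of the non-breaking ordering, with hypothetical parameter names, not the actual PEFT signature.

```python
# Illustrative sketch of a backward-compatible signature: the new
# optional `config` parameter is placed after the existing positional
# parameters, so current two-argument callers keep working unchanged.

def from_pretrained(model, model_id, config=None, **kwargs):
    # Existing callers pass (model, model_id); new callers may also
    # pass a pre-built config without breaking the old call sites.
    return (model, model_id, config)
```

Placing the new parameter last (and defaulting it to `None`) is what avoids the breaking change for positional callers.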
The documentation is not available anymore as the PR was closed or merged.
Signed-off-by: Aaron <29749331+aarnphm@users.noreply.github.com>
I have updated accordingly and added test cases for this. PTAL when you have time. Thanks!
Thanks a lot for iterating, adding the tests, and adding the missing docstring on from_pretrained!
This looks great to me. I left one suggestion: I think we should raise a proper error if the user didn't pass a PeftConfig. What do you think?
Also make sure to run the styling checks before pushing:
make style && make quality
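The validation suggested above could look roughly like the following. This is a hedged sketch, not the actual PEFT implementation: `PeftConfig` here is a stand-in class and `check_config` is a hypothetical helper name.

```python
# Hypothetical sketch of the suggested validation: raise a clear error
# when the object passed to from_pretrained is not a PeftConfig.

class PeftConfig:
    """Stand-in for peft.PeftConfig, for illustration only."""


def check_config(config):
    # Reject anything that is not a PeftConfig instance with a
    # descriptive error, rather than failing later with an opaque one.
    if not isinstance(config, PeftConfig):
        raise ValueError(
            f"The input config must be a PeftConfig instance, but got {type(config)}."
        )
    return config
```

Failing fast with a `ValueError` that names the expected type makes the misuse obvious at the call site instead of deep inside model loading.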
Signed-off-by: Aaron <29749331+aarnphm@users.noreply.github.com>
Makes sense to raise the error. Great recommendation!
Super clean work! Thanks so much for this! Let's wait for @pacman100's review before merging this.
Hello @aarnphm, thank you for adding this, LGTM! 🤗 Once the conflicts are resolved we can merge the PR.
Curious to know the use case for this, though, as the config is always available with the saved model.
Hi @pacman100, an example use case would be: when you configure the config class beforehand, it saves one cycle of construction and dictionary lookup. Also, say we have an LLM class representing a model that contains multiple config kwargs:

    class LLM:
        peft_config_map = {"lora": {**lora_kwargs}, "prompt_tuning": {**prompt_tuning_kwargs}}

The LLM constructs each of the PEFT configs beforehand, and only converts the model to a PeftModel on the model server (since the config is light to construct, whereas the model is heavy).
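The pattern described above can be fleshed out as a minimal sketch. All names here (`LoraConfig`, `PromptTuningConfig`, `to_peft_model`) are illustrative stand-ins, not the real peft classes or API.

```python
# Sketch of the use case: pre-build lightweight PEFT configs up front,
# and defer the heavy model conversion to the model server.
# Stand-in config classes (real code would use peft's config classes).

class LoraConfig(dict):
    """Illustrative stand-in for a LoRA config."""


class PromptTuningConfig(dict):
    """Illustrative stand-in for a prompt-tuning config."""


class LLM:
    # Configs are cheap to construct, so build them all ahead of time.
    peft_config_map = {
        "lora": LoraConfig(r=8, lora_alpha=16),
        "prompt_tuning": PromptTuningConfig(num_virtual_tokens=20),
    }

    def to_peft_model(self, base_model, method="lora"):
        # Only here, on the model server, is the heavy model touched;
        # the pre-built config is passed straight through.
        config = self.peft_config_map[method]
        return (base_model, config)  # placeholder for PeftModel.from_pretrained(...)
```

With a `from_pretrained` that accepts a config instance directly, the server-side conversion step needs no extra config construction or dictionary lookup.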
Signed-off-by: Aaron <29749331+aarnphm@users.noreply.github.com>