
feat(model): Allow from_pretrained to accept PeftConfig class #612

Merged
5 commits merged into huggingface:main on Jun 27, 2023

Conversation


@aarnphm aarnphm commented Jun 21, 2023

Signed-off-by: Aaron <29749331+aarnphm@users.noreply.github.com>

@younesbelkada (Contributor) left a comment


I really like the idea, thanks a lot for working on this!
Would you mind adding a simple test for it? It can live either here: https://github.com/huggingface/peft/blob/main/tests/test_config.py or here: https://github.com/huggingface/peft/blob/main/tests/testing_common.py (in the latter case you need to duplicate it in test_decoder_models.py and test_encoder_decoder_models.py).
I also left a comment: maybe we should swap the order of the positional arguments to avoid a breaking change. What do you think?
cc @pacman100 as well to make sure we're aligned on this
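A minimal sketch of the ordering concern above (all names are illustrative, not peft's actual implementation): if the new `config` parameter is added after `model_id` with a default value, existing positional call sites keep working unchanged.

```python
def load_config_from_hub(model_id):
    # Hypothetical stand-in for loading the config from the saved adapter.
    return {"source": model_id}

def from_pretrained(model, model_id, config=None):
    # `config` comes *after* `model_id`, so an existing positional call
    # like from_pretrained(model, "some/adapter") is not broken.
    if config is None:
        config = load_config_from_hub(model_id)
    return model, config

# Existing call sites (no config passed) behave as before:
_, cfg = from_pretrained("base-model", "some/adapter")
```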

(Review comment on src/peft/peft_model.py: outdated, resolved)
@HuggingFaceDocBuilderDev commented Jun 21, 2023

The documentation is not available anymore as the PR was closed or merged.

Signed-off-by: Aaron <29749331+aarnphm@users.noreply.github.com>
@aarnphm (Contributor, Author) commented Jun 22, 2023

I have updated accordingly and added test cases for this. PTAL when you have time. Thanks!


@younesbelkada (Contributor) left a comment


Thanks a lot for iterating, for adding the tests, and for adding the missing docstring on from_pretrained!
This looks great to me. I left one suggestion: I think we should raise a proper error if the user didn't pass a PeftConfig. What do you think?

Also make sure to run the styling checks before pushing:

make style && make quality
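A minimal sketch of the suggested error check, assuming a PeftConfig base class (the stand-in class below is illustrative, not peft's real class):

```python
from dataclasses import dataclass

@dataclass
class PeftConfig:
    # Illustrative stand-in for peft's PeftConfig base class.
    peft_type: str = "LORA"

def check_peft_config(config):
    # Raise a clear, early error instead of failing later with an
    # opaque AttributeError when `config` is not a PeftConfig.
    if not isinstance(config, PeftConfig):
        raise ValueError(
            f"The input config must be a PeftConfig, got {config.__class__}"
        )
    return config
```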

Signed-off-by: Aaron <29749331+aarnphm@users.noreply.github.com>
@aarnphm (Contributor, Author) commented Jun 22, 2023

Makes sense to raise the error. Great recommendation!


@younesbelkada (Contributor) left a comment


Super clean work! Thanks so much for this! Let's wait for @pacman100's review before merging.


@pacman100 (Contributor) left a comment


Hello @aarnphm, thank you for adding this, LGTM! 🤗 Once the conflicts are resolved, we can merge the PR.

Curious to know the use case for this, though, since the config is always available alongside the saved model.

@aarnphm (Contributor, Author) commented Jun 27, 2023

Hi @pacman100, an example use case:

If you construct the config class beforehand, it saves one round of config construction and dictionary lookup inside PeftModel.from_pretrained.

Also, say we have an LLM class representing a model that holds multiple sets of PEFT config kwargs:

```python
class LLM:
    peft_config_map = {"lora": {**lora_kwargs}, "prompt_tuning": {**prompt_tuning_kwargs}}
```

LLM then constructs each PEFT config ahead of time and only converts the model to a PeftModel on the model server, since constructing the config is cheap whereas loading the model is heavy.
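The pattern described above could be sketched like this (all class names below are hypothetical, lightweight stand-ins; real code would use peft's LoraConfig / PromptTuningConfig and later call PeftModel.from_pretrained with config=... on the server):

```python
from dataclasses import dataclass

# Hypothetical lightweight config classes standing in for peft's
# LoraConfig / PromptTuningConfig.
@dataclass
class LoraConfig:
    r: int = 8
    lora_alpha: int = 16

@dataclass
class PromptTuningConfig:
    num_virtual_tokens: int = 20

class LLM:
    # Configs are cheap, so they are built eagerly up front; the heavy
    # model-to-PeftModel conversion is deferred to the model server,
    # which would later do something like:
    #   PeftModel.from_pretrained(base_model, adapter_id, config=cfg)
    peft_config_map = {
        "lora": LoraConfig(r=16),
        "prompt_tuning": PromptTuningConfig(num_virtual_tokens=30),
    }

    def get_peft_config(self, name):
        return self.peft_config_map[name]
```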

Signed-off-by: Aaron <29749331+aarnphm@users.noreply.github.com>
@pacman100 pacman100 merged commit f5352f0 into huggingface:main Jun 27, 2023
11 checks passed
@aarnphm aarnphm deleted the feat/from-pretrained-peft-config branch June 27, 2023 16:02