
[utils] add merge_lora utility function #227

Merged: 16 commits, Mar 30, 2023

Conversation

younesbelkada (Collaborator):

What does this PR do?

This PR introduces a new utility function, merge_lora, to conveniently merge LoRA layers into the base model, so that users can use the result as a standalone transformers model.
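A minimal sketch of the intended usage (the model ID and adapter path are placeholders, and the exact entry point may differ from the final API; the merged method ended up as merge_and_unload on the LoRA wrapper):

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load the base model and attach a trained LoRA adapter.
base_model = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")
model = PeftModel.from_pretrained(base_model, "path/to/lora-adapter")

# Fold the LoRA weights into the base weights and drop the PEFT wrappers,
# leaving a plain transformers model that can be saved and shipped as usual.
merged = model.base_model.merge_and_unload()
merged.save_pretrained("path/to/merged-model")
```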

Also added some tests (currently failing).

cc @pacman100

younesbelkada requested a review from pacman100 and removed a previous request for pacman100 (March 29, 2023, 13:21)
younesbelkada (Collaborator, Author):

This PR is now ready for review! All tests are passing.

pacman100 (Collaborator) left a comment:

Thank you @younesbelkada for adding this feature, which resolves many issues related to merging the LoRA params back into the base model without any wrappers on it. ✨

Left a couple of comments and suggestions.

src/peft/tuners/lora.py (review comment, outdated; resolved)
```diff
@@ -233,6 +238,34 @@ def enable_adapter_layers(self):
     def disable_adapter_layers(self):
         self._set_adapter_layers(enabled=False)

+    def merge_and_unload(self):
```
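For context, merging a LoRA linear layer folds the low-rank update into the frozen weight. A minimal sketch of the per-layer math (names are illustrative, not the PR's exact code):

```python
import torch

def merge_lora_into_linear(weight: torch.Tensor,
                           lora_A: torch.Tensor,
                           lora_B: torch.Tensor,
                           scaling: float) -> torch.Tensor:
    # Standard LoRA merge for a Linear layer:
    #   W_merged = W + scaling * (B @ A)
    # with weight of shape (out_features, in_features),
    # lora_A of shape (r, in_features), lora_B of shape (out_features, r).
    return weight + scaling * (lora_B @ lora_A)
```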
pacman100 (Collaborator):

This would not work when the model is loaded in 8-bit. An assertion error for that case would be helpful.
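A minimal sketch of such a guard, assuming the `is_loaded_in_8bit` flag that transformers sets on bitsandbytes-quantized models (the exact check and error message are up to the PR):

```python
def merge_and_unload(self):
    # Merging folds the LoRA update into the base weights, which is not
    # meaningful when those weights are stored as quantized int8 tensors.
    if getattr(self.model, "is_loaded_in_8bit", False):
        raise ValueError("Cannot merge LoRA layers when the model is loaded in 8-bit mode")
    ...
```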

younesbelkada (Collaborator, Author):

Thanks for the pointer! Added a check for that.

younesbelkada and others added 3 commits March 30, 2023 13:19
Co-authored-by: Sourab Mangrulkar <13534540+pacman100@users.noreply.github.com>
pacman100 (Collaborator) left a comment:

Final comments/suggestions.

src/peft/tuners/lora.py (review comment, outdated; resolved)
Co-authored-by: Sourab Mangrulkar <13534540+pacman100@users.noreply.github.com>