
Configure PEFT from config #3571

Merged

merged 7 commits into from Jul 15, 2023
Conversation

@shahules786 (Collaborator) commented Jul 15, 2023

What

Added support for configuring PEFT from the config and for saving WTE embeddings alongside the adapter files, to enable easy loading of OA LoRA weights.
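As a rough sketch of the embedding-saving idea (the helper name, file name, and use of `get_input_embeddings` below are illustrative assumptions, not the exact code merged here):

```python
# Illustrative sketch only: the helper and file names are assumptions,
# not the exact code from this PR.
import os
import torch

def save_adapter_with_wte(peft_model, output_dir: str) -> None:
    # peft's save_pretrained() writes only the adapter weights, so the
    # word-token embeddings (WTE) are stored separately. This lets OA LoRA
    # checkpoints with resized/extended embeddings be reloaded easily.
    peft_model.save_pretrained(output_dir)
    wte_state = peft_model.get_input_embeddings().state_dict()
    torch.save(wte_state, os.path.join(output_dir, "extra_embeddings.pt"))
```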

Why

Previously, the PEFT modules were hardcoded for the Llama model only. This was a problem when training other models with PEFT, such as RWModel, GPTNeoX, etc.

How

Introduces an extra parameter, peft_config, in config.yml.
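A minimal sketch of what such a peft_config block might look like and how training code could unpack it into a peft LoraConfig; the YAML keys, model name, and target modules are illustrative assumptions, not necessarily Open-Assistant's exact schema:

```python
# Sketch, assuming peft_config maps directly onto peft.LoraConfig kwargs;
# the YAML keys and model name are illustrative, not OA's exact schema.
import yaml
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForCausalLM

CONFIG_YML = """
model_name: tiiuae/falcon-7b
peft_config:
  r: 16
  lora_alpha: 32
  lora_dropout: 0.05
  target_modules: ["query_key_value"]  # e.g. RWModel attention projection
"""

conf = yaml.safe_load(CONFIG_YML)

# Build the LoRA config from the YAML block rather than llama-only
# hardcoded defaults, so each architecture can name its own target modules.
lora_config = LoraConfig(task_type=TaskType.CAUSAL_LM, **conf["peft_config"])

model = AutoModelForCausalLM.from_pretrained(conf["model_name"])
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```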

@github-actions commented

pre-commit failed.
Please run `pre-commit run --all-files` locally and commit the changes.
Find more information in the repository's CONTRIBUTING.md.

@andreaskoepf (Collaborator) left a comment


very nice, thanks!

@andreaskoepf andreaskoepf merged commit b8637f2 into LAION-AI:main Jul 15, 2023
1 check passed