deps: update peft to install from pypi #265
Conversation
Signed-off-by: Anh-Uong <anh.uong@ibm.com>
I tested this by running the tox unit tests and by running the |
Co-authored-by: Gabe Goodhart <gabe.l.hart@gmail.com> Signed-off-by: Anh Uong <anhuong4444@gmail.com>
How about just pinning |
LGTM
Let's go with @dtrifiro's suggestion of pinning peft; that would be a bit more resilient to sudden or surprise breakage.
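The suggested pin could look like the following in the project's dependency list. This is only a sketch: the version numbers and the use of pyproject.toml are illustrative assumptions, not taken from the PR itself.

```toml
# pyproject.toml (sketch) -- version bounds shown here are hypothetical
[project]
dependencies = [
    # hard pin: most resilient to surprise upstream breakage
    "peft==0.6.0",
    # alternatively, a bounded range that still picks up patch releases:
    # "peft>=0.6.0,<0.7.0",
]
```

A hard pin trades automatic bug-fix pickup for reproducibility; a bounded range is a common middle ground when upstream follows semantic versioning.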
Signed-off-by: Anh-Uong <anh.uong@ibm.com>
Going to unblock due to maintainer availability, since the requested changes were addressed.
Verified training ability and inference after deployment (PT, MPT on causal LM [bloom] and seq2seq [flan-t5-xl]). Edit: after the revert of the HF trainer in 0.4.0, I verified that inference results were consistent for the same models across version changes.
LGTM
relates to: #263