Make safetensors the default #2120
Conversation
The documentation is not available anymore as the PR was closed or merged.
Looks great! One thing we're also doing in transformers that you might want to do here is to keep testing `safe_serialization=False` in the common pathways. By removing tests that serialize `pytorch_model.bin` (which happens implicitly, since the default isn't explicitly set in the tests and the default just changed), we risk breaking that code path without noticing. You implemented one new test; do you think others are needed? That's what we did on the transformers side (not merged yet): huggingface/transformers#27242
In general this LGTM; the implementation looks clean. I have a few questions for my understanding. The whole of `fsdp_utils.py` still relies on `torch.save`; should this also be adjusted?
Good point.
I think this is something where test coverage could be helpful, as it would reveal if we have code paths that are no longer taken but which used to be taken.
@LysandreJik @BenjaminBossan I've updated all of our tests that do anything with
LGTM! I've left a few comments.
LGTM, thanks Zach. I have a few comments but none are blockers.
Great, LGTM, thanks.
If `safe_serialization` is `True`, models will be saved with `safetensors` while the rest are saved using native `pickle`.
👍
return f"{func.__name__}_{param_based_name}"

@parameterized_class(("use_safetensors",), [[True], [False]], class_name_func=parameterized_custom_name_func)
TIL about `parameterized_class`.
Thanks for iterating @muellerzr! LGTM
Thanks for iterating!
What does this PR do?
Similar to transformers, this makes `safetensors` the default and a library requirement 🤗

Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@LysandreJik @SunMarc @BenjaminBossan