feat: add pytorch checkpoint hooks on load/save hooks [DET-5109] #2118
Conversation
Force-pushed from 65fd3a3 to 8a3835e
Looks good overall; no big comments, just a few minor ones.
        return

    for callback in self.callbacks.values():
        # QUESTION: should we encourage users to return new vals instead of
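The fragment above hints at a dispatch loop that gives every registered callback a chance to touch the checkpoint dict. A minimal self-contained sketch of that pattern (class and method names here are assumed from context, not Determined's actual API):

```python
# Hypothetical sketch of the callback dispatch pattern in the diff above.
# Each callback's on_save_checkpoint hook is given the checkpoint dict and
# mutates it in place, rather than returning a replacement dict.
class CheckpointDispatcher:
    def __init__(self, callbacks: dict):
        # callbacks is a name -> callback-object mapping, as the loop suggests
        self.callbacks = callbacks

    def run_save_hooks(self, checkpoint: dict) -> dict:
        for callback in self.callbacks.values():
            # hooks modify `checkpoint` in place; any return value is ignored
            callback.on_save_checkpoint(checkpoint)
        return checkpoint
```

The in-place convention keeps the loop simple: callers never have to rebind the checkpoint to each hook's return value.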
LightningModule used the original reference so that you could modify the checkpoint in place. But it looks like either way works.
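To make the two styles under discussion concrete, here is a hedged side-by-side sketch (hypothetical hook bodies, not code from this PR): in-place mutation relies on the caller passing the dict by reference, while the return-a-new-value style requires the caller to use the returned dict.

```python
# Hypothetical examples of the two hook styles discussed above.

def on_save_checkpoint_inplace(checkpoint: dict) -> None:
    # Mutates the original dict; the caller sees the change automatically.
    checkpoint["custom_state"] = {"step": 42}

def on_save_checkpoint_return(checkpoint: dict) -> dict:
    # Builds and returns a new dict; the original is left untouched,
    # so the caller must adopt the return value.
    new_ckpt = dict(checkpoint)
    new_ckpt["custom_state"] = {"step": 42}
    return new_ckpt
```

As the comment notes, either style works; in-place mutation just avoids forcing every caller to rebind the checkpoint.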
LGTM!
Description
Add LightningModule's on_load_checkpoint and on_save_checkpoint hooks.
Test Plan
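For illustration, user code overriding these two hooks might look like the sketch below. A plain class is used so the example is self-contained; in real code these methods would live on a pytorch_lightning.LightningModule subclass, where on_save_checkpoint and on_load_checkpoint are the standard Lightning hook names.

```python
# Self-contained sketch of the two checkpoint hooks surfaced by this PR.
class MyModule:
    def __init__(self):
        self.custom_state = 0

    def on_save_checkpoint(self, checkpoint: dict) -> None:
        # Called just before the checkpoint dict is written out:
        # stash any extra state alongside the model weights.
        checkpoint["custom_state"] = self.custom_state

    def on_load_checkpoint(self, checkpoint: dict) -> None:
        # Called just after the checkpoint dict is read back:
        # restore the extra state saved above.
        self.custom_state = checkpoint["custom_state"]
```

A save/load round trip through these hooks should restore the module's extra state exactly.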
Commentary (optional)
No need for release notes, as the Lightning adapter is only being officially released at the same time.
Checklist
Release note added to docs/release-notes/. See Release Note for details.