Fix/ptl1.6.0 #888

Merged
merged 7 commits into master on Apr 5, 2022

Conversation

@hrzn hrzn (Contributor) commented on Apr 5, 2022

Fixes two issues that arose with PyTorch Lightning >= 1.6.0:

  • The current_epoch attribute is now "fixed" upstream in PTL, so we have to make our own fix dynamic and check at runtime which PTL version is installed (see the first sketch after this list).
  • For some reason, saving models using torch.save() alone did not save the state of the PTL trainer correctly, so we revert to saving the PTL module directly using PTL's checkpointing mechanism (see the second sketch after this list). Pros: it fixes the issue and should also be more robust, since it no longer depends e.g. on the current path. Con: it saves two files instead of one, and the parameters are saved twice. (I'll open a separate issue to address this if we can.)
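
A minimal sketch of the kind of runtime version check described above, assuming the standard packaging helper; the exact check used in the PR may differ:

```python
from packaging import version
import pytorch_lightning as pl

# PTL >= 1.6.0 already ships the upstream current_epoch fix, so only
# apply our own workaround when an older version is installed.
if version.parse(pl.__version__) < version.parse("1.6.0"):
    ...  # apply the pre-1.6.0 current_epoch workaround here
```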
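
And a minimal, self-contained sketch of saving and loading through PTL's checkpointing mechanism; TinyModule and the file name are illustrative only, not the actual Darts classes:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class TinyModule(pl.LightningModule):
    """Illustrative stand-in for the real PTL module."""

    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(4, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())


data = DataLoader(TensorDataset(torch.randn(8, 4), torch.randn(8, 1)), batch_size=4)
model = TinyModule()
trainer = pl.Trainer(max_epochs=1)
trainer.fit(model, data)

# Saving through PTL's checkpointing keeps the full trainer state
# (weights, optimizer state, epoch/step counters, ...), which plain
# torch.save() did not capture correctly under PTL >= 1.6.0.
trainer.save_checkpoint("model.ckpt")

# Loading restores the module directly from the PTL checkpoint.
restored = TinyModule.load_from_checkpoint("model.ckpt")
```

The "parameters are saved twice" con from the description corresponds to the checkpoint file carrying the model weights in addition to whatever is still persisted separately with torch.save().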

@dennisbader dennisbader (Collaborator) left a comment

Looks good, thanks a lot!

@hrzn hrzn merged commit efa955a into master on Apr 5, 2022
@madtoinou madtoinou deleted the fix/ptl1.6.0 branch July 5, 2023 21:52