Trainers: skip weights and augmentations when saving hparams #1670

Merged
merged 7 commits into microsoft:main on Nov 10, 2023

Conversation

@dylanrstewart (Contributor) commented on Oct 16, 2023

Fixes #1622
Fixes #1639

@github-actions bot added the trainers (PyTorch Lightning trainers) label on Oct 16, 2023
@dylanrstewart changed the title from "Update base.py to fix for custom augmentations" to "Update base.py to fix for custom augmentations #1622" on Oct 16, 2023
@dylanrstewart changed the title from "Update base.py to fix for custom augmentations #1622" to "Update base.py to fix for custom augmentations" on Oct 16, 2023
@dylanrstewart (Contributor, Author):

Fix for #1622

@adamjstewart (Collaborator):

So this fixes MoCo, but not SimCLR. Would love a more flexible way of doing this that only affects one trainer instead of all of them. We could either move save_hyperparameters out of base and into each trainer, or add an ignore class attribute. Thoughts?

Also want to add tests. Can modify one of the (moco|simclr)*.yaml files to use a different augmentation.
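
To make the second option concrete, here is a minimal sketch of what an ignore class attribute could look like; the class, attribute, and parameter names are illustrative only, not the API that was eventually merged:

```python
from typing import Any, Optional, Sequence

from lightning.pytorch import LightningModule


class BaseTask(LightningModule):
    """Hypothetical base trainer; only the hyperparameter-saving logic is shown."""

    #: Constructor arguments that subclasses do not want saved as hyperparameters.
    ignore: Optional[Sequence[str]] = None

    def __init__(self) -> None:
        super().__init__()
        # Lightning's save_hyperparameters supports an ``ignore`` argument, so
        # non-serializable inputs (nn.Module augmentations, weight enums) can be
        # excluded per trainer instead of special-casing every subclass.
        self.save_hyperparameters(ignore=self.ignore)


class MoCoTask(BaseTask):
    # Only MoCo needs to skip its two augmentation pipelines and the weights enum.
    ignore = ["weights", "augmentation1", "augmentation2"]

    def __init__(
        self,
        model: str = "resnet50",
        weights: Optional[Any] = None,
        augmentation1: Optional[Any] = None,
        augmentation2: Optional[Any] = None,
    ) -> None:
        super().__init__()
```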

@adamjstewart added this to the 0.5.1 milestone on Oct 16, 2023
@adamjstewart changed the title from "Update base.py to fix for custom augmentations" to "Trainers: skip weights and augmentations when saving hparams" on Nov 6, 2023
@adamjstewart (Collaborator):

@dylanrstewart I extended your idea to make it more generalizable. Hope you don't mind. Now subclasses can decide exactly which arguments they want to ignore.
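
Continuing the hypothetical sketch above, the intended effect is that each subclass picks exactly which of its own constructor arguments are dropped, while everything else is still recorded (parameter names mirror that sketch, not necessarily the real trainer signatures):

```python
import kornia.augmentation as K

# Hypothetical instantiation using the MoCoTask sketched earlier.
task = MoCoTask(
    model="resnet18",
    augmentation1=K.RandomGrayscale(p=0.2),
    augmentation2=K.RandomHorizontalFlip(p=0.5),
)

# Serializable arguments are still recorded; the nn.Module augmentations and
# the weights enum are not, so checkpoints and logged configs stay clean.
assert "model" in task.hparams
assert "augmentation1" not in task.hparams
assert "weights" not in task.hparams
```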

@adamjstewart (Collaborator):

Also added a test that would fail on main but ensures that augmentation overrides work on this branch.
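
The actual test goes through the trainer YAML configs, as mentioned in the earlier comment. Purely as a hedged illustration of the behavior it protects (assuming SimCLRTask accepts an augmentations argument, as discussed above), a standalone check could look like:

```python
import kornia.augmentation as K
from torchgeo.trainers import SimCLRTask


def test_augmentation_override_excluded_from_hparams() -> None:
    """Overriding the default augmentations should not end up in saved hparams."""
    augmentations = K.AugmentationSequential(
        K.RandomResizedCrop(size=(224, 224)),
        K.RandomHorizontalFlip(),
        data_keys=["input"],
    )
    task = SimCLRTask(model="resnet18", augmentations=augmentations)

    # Custom nn.Module augmentations and weight enums are skipped when saving
    # hyperparameters, so checkpointing and config logging keep working.
    assert "augmentations" not in task.hparams
    assert "weights" not in task.hparams
```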

@github-actions bot added the testing (Continuous integration testing) label on Nov 6, 2023
@adamjstewart (Collaborator):

@dylanrstewart if these changes look good to you, can you agree to the CLA? That's the only remaining thing preventing this PR from being merged. See https://github.com/microsoft/contributorlicenseagreement#accepting for instructions.

@dylanrstewart (Contributor, Author):

@microsoft-github-policy-service agree

@adamjstewart reopened this on Nov 6, 2023
@adamjstewart (Collaborator):

Not sure why CLA bot isn't working. Usually closing and reopening works.

@adamjstewart reopened this on Nov 6, 2023
@adamjstewart (Collaborator):

Can you try accepting again?

@adamjstewart (Collaborator):

Ping @dylanrstewart can you try accepting the CLA again?

@dylanrstewart (Contributor, Author):

@microsoft-github-policy-service agree

@adamjstewart merged commit bec09c2 into microsoft:main on Nov 10, 2023
72 checks passed
nilsleh pushed a commit that referenced this pull request on Nov 10, 2023:
* Update base.py to fix for custom augmentations

* Allow subclasses to ignore specific arguments

* Fix typing

* Save to self.weights

* pyupgrade

* Add test

* Save weights

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Labels: testing (Continuous integration testing), trainers (PyTorch Lightning trainers)
Projects: None yet
3 participants