fix optimwrapper to work with param_groups (closes #2829) #3241

Merged: 6 commits into fastai:master on Mar 6, 2021

Conversation

tmabraham (Contributor)

Currently, OptimWrapper does not support param_groups, which prevents the use of model freezing/unfreezing and discriminative learning rates.

This PR incorporates @muellerzr's fix into OptimWrapper. In doing so, the usage of OptimWrapper changes slightly, from:

def opt_func(params, lr, **kwargs): return OptimWrapper(torch.optim.SGD(params, lr=lr))

to

def opt_func(params, **kwargs): return OptimWrapper(torch.optim.SGD, params, **kwargs)

This PR also closes issue #2829.
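
For illustration only (not part of the PR text), here is a rough sketch of how the new-style opt_func could be passed to a Learner; the dataset, architecture, and learning rates are placeholder choices, and note that a later commit in this PR swaps the argument order (see below):

```python
import torch
from fastai.vision.all import *

# New-style opt_func: OptimWrapper receives the torch optimizer class and the
# parameter groups, rather than an already-constructed optimizer instance.
def opt_func(params, **kwargs): return OptimWrapper(torch.optim.SGD, params, **kwargs)

path = untar_data(URLs.MNIST_SAMPLE)
dls = ImageDataLoaders.from_folder(path)
learn = cnn_learner(dls, resnet18, opt_func=opt_func, metrics=accuracy)

# With param_groups supported, freezing and discriminative learning rates
# should now work with a wrapped PyTorch optimizer as well.
learn.freeze()
learn.fit_one_cycle(1)
learn.unfreeze()
learn.fit_one_cycle(1, lr_max=slice(1e-5, 1e-3))
```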

@tmabraham tmabraham requested a review from jph00 as a code owner March 5, 2021 07:49

tmabraham (Contributor, Author)

I just realized that by switching the arguments of OptimWrapper, I can make it even easier to use:
opt_func = partial(OptimWrapper, opt=torch.optim.SGD)

I have added this change as a commit, but @jph00 please let me know if you are fine with this. If you think it is too much of a change to the OptimWrapper usage, I will revert to the previously agreed-upon changes...
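
As an illustration (my addition, not from the PR), the partial form behaves like an explicitly defined opt_func once params comes first and the optimizer class is passed as the opt keyword; the two-layer model and group split below are hypothetical, and the expectation that each inner list becomes its own param_group follows from the PR's stated goal:

```python
from functools import partial
import torch
from torch import nn
from fastai.optimizer import OptimWrapper

# Two equivalent ways to build an opt_func with the swapped argument order.
def opt_func(params, **kwargs): return OptimWrapper(params, opt=torch.optim.SGD, **kwargs)
opt_func2 = partial(OptimWrapper, opt=torch.optim.SGD)

# Hypothetical split into two parameter groups: each inner list should end up
# as its own param_group on the wrapped SGD optimizer, which is what makes
# freeze/unfreeze and slice-based discriminative learning rates possible.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
groups = [list(model[0].parameters()), list(model[2].parameters())]
opt = opt_func2(groups, lr=1e-3)
```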

jph00 (Member) left a comment

Looking good! Minor suggestions

3 review suggestions on fastai/optimizer.py (outdated, resolved)
tmabraham and others added 2 commits March 5, 2021 19:58
tmabraham (Contributor, Author)

@jph00 I made changes based on your suggestions, and also added the doc changes from #3226

jph00 (Member) commented Mar 6, 2021

@tmabraham any chance you could look to see why one notebook is timing out in tests in CI?

tmabraham (Contributor, Author)

@jph00 fixed! Surprised this wasn't an issue before, but one of the cells was not marked slow...
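
For context, and as an assumption on my part rather than something stated in the thread: the fastai test notebooks use nbdev-style comment flags, so a long-running cell is kept out of the default CI run by flagging its first line, roughly like this:

```python
#slow
# A notebook cell whose first line carries the `slow` flag is skipped by the
# default notebook test run and only executes when that flag is requested,
# e.g. with `nbdev_test_nbs --flags slow` (assumed nbdev v1 behaviour).
learn.fit_one_cycle(5)  # `learn` comes from an earlier cell in the notebook
```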

@jph00 jph00 added the breaking label Mar 6, 2021
jph00 (Member) commented Mar 6, 2021

Thank you! :)

@jph00 jph00 merged commit 89b05f5 into fastai:master Mar 6, 2021
@tmabraham tmabraham deleted the update-optimwrapper branch August 26, 2021 23:41