
[Enhancement] Let optimizer use built torch opt #3642

Merged · 2 commits into fastai:master · May 11, 2022

Conversation

@muellerzr (Contributor) commented on May 11, 2022

This PR slightly tweaks the constructor for OptimWrapper, allowing it to accept an already-built PyTorch optimizer. This is useful when another library needs to hand fastai a built optimizer directly. The prime motivation is Accelerate: it can prepare an optimizer, but doing so converts it back into a plain PyTorch optimizer, which then needs to be wrapped in OptimWrapper again.

Example use case:

from accelerate import Accelerator
from fastai.optimizer import OptimWrapper

accelerator = Accelerator()
# some func to build a learner
learn = get_learner(...)
opt = accelerator.prepare_optimizer(learn.opt)
# Rewrap the prepared optimizer so fastai can keep using it
learn.opt = OptimWrapper(opt=opt)
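
For context, here is a minimal sketch of what such a constructor change can look like (a hypothetical OptimWrapperSketch class for illustration only, not fastai's actual code): if opt is already a torch.optim.Optimizer it is wrapped as-is, otherwise it is treated as something to build from the given params.

import torch

class OptimWrapperSketch:  # hypothetical illustration, not the real OptimWrapper
    def __init__(self, params=None, opt=None, **kwargs):
        if isinstance(opt, torch.optim.Optimizer):
            # Already-built optimizer (e.g. one returned by Accelerate): use it directly
            self.opt = opt
        else:
            # Otherwise treat `opt` as a factory/class and build it from the params
            self.opt = opt(params, **kwargs)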

To also keep it compatible with Accelerate, step now takes a closure parameter, but raises an error if it is anything other than None. This keeps the signature compatible with regular torch optimizers as well, though actual closure support is not implemented yet (which @tmabraham is working on, I believe).
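
For illustration, a minimal sketch of the step() behaviour described above (assumed shape, not the exact fastai implementation):

def step(self, closure=None):
    # closure is accepted only so the signature matches torch optimizers and
    # what Accelerate expects; real closure support does not exist yet.
    if closure is not None:
        raise NotImplementedError("closure support is not implemented in OptimWrapper")
    # ...otherwise perform the usual fastai optimizer step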
cc @jph00

@muellerzr requested a review from jph00 as a code owner on May 11, 2022 at 13:25

@jph00 merged commit 2106700 into fastai:master on May 11, 2022