
optimizer wrapper improve docs #3226

Closed
wants to merge 1 commit

Conversation

tmabraham
Contributor

There is often confusion about how to use OptimWrapper (which lets us use native PyTorch optimizers). Even @muellerzr has experienced some confusion in this regard (see #3225). This PR improves the documentation to clarify this. I don't know if this is the best approach, so I'm open to moving the text I wrote to another location, adding text somewhere else, etc.
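For context, here is a minimal sketch of the pattern in question (hedged: this assumes the OptimWrapper(opt) signature fastai had at the time of this PR, and the function body is illustrative, not the PR's docs text):

```python
import torch
from fastai.optimizer import OptimWrapper

# Sketch: build the PyTorch optimizer first, then wrap it, so that Learner's
# `lr` and related kwargs go to the torch optimizer and not to OptimWrapper.
def opt_func(params, lr=1e-3, **kwargs):
    return OptimWrapper(torch.optim.AdamW(params, lr=lr, **kwargs))

# learn = Learner(dls, model, opt_func=opt_func)
```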

@review-notebook-app
Check out this pull request on ReviewNB to see visual diffs and provide feedback on Jupyter Notebooks.

@jph00
Member

jph00 commented Feb 17, 2021

Do we know what's changed that's caused the compose()/inline approach to not work any more? cc @muellerzr

@muellerzr
Contributor

@jph00 I sadly don't know exactly what is causing the issue. (I also don't know what version fixed this, as I didn't really use this functionality much.)

For some reason it looks like it's trying to do something like AdamW()() (i.e. call the class instance) rather than .step(), I believe? But I also don't know exactly what the old behavior did.
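If it helps, that failure mode is easy to reproduce in isolation (a hypothetical illustration, not a fastai traceback):

```python
import torch

opt = torch.optim.AdamW([torch.randn(2, requires_grad=True)])
opt.step()  # fine: this is the intended call
opt()       # TypeError: 'AdamW' object is not callable
```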

@tmabraham used it much more, so he may have an idea?

@tmabraham
Contributor Author

There are two problems when it comes to the compose approach. This is what I am trying:

opt_func = compose(partial(torch.optim.Adam, eps=1e-6), OptimWrapper)

(I had also tried something similar earlier with AdaBelief)

If I try passing this opt_func into Learner like this:

learn_ = Learner(dls, resnet34(), opt_func=opt_func, metrics=error_rate)
learn_.fit(1)

I get this error:

TypeError: __init__() got an unexpected keyword argument 'lr'

I believe this is a limitation of fastcore's compose function: compose passes the same kwargs into all of the composed functions, but OptimWrapper doesn't expect lr and the other related arguments, hence the error.
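To make that concrete, here is a minimal reproduction of my reading of the problem (hedged: this assumes the OptimWrapper(opt) signature of the time, and opt_func2 is just an illustrative name):

```python
from functools import partial
import torch
from fastcore.utils import compose
from fastai.optimizer import OptimWrapper

opt_func = compose(partial(torch.optim.Adam, eps=1e-6), OptimWrapper)
params = [torch.randn(2, requires_grad=True)]

# Learner effectively calls opt_func(params, lr=...). compose forwards the
# same kwargs to every composed function, so `lr` also reaches
# OptimWrapper.__init__, which doesn't accept it.
try:
    opt_func(params, lr=1e-3)
except TypeError as e:
    print(e)  # __init__() got an unexpected keyword argument 'lr'

# A plain wrapper function sidesteps the shared-kwargs behaviour:
def opt_func2(params, **kwargs):
    return OptimWrapper(torch.optim.Adam(params, eps=1e-6, **kwargs))
```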

If I try with cnn_learner like this:

path = untar_data(URLs.PETS)/'images'

def is_cat(x): return x[0].isupper()
dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path), valid_pct=0.2, seed=42,
    label_func=is_cat, item_tfms=Resize(224))

learn = cnn_learner(dls, resnet34, opt_func=opt_func, metrics=error_rate)
learn.fit(1)

I get the following error:

TypeError: optimizer can only optimize Tensors, but one of the params is list

I am not sure why I get this error, but it may have something to do with param_lists (the splitting of the model into parameter groups).
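For what it's worth, here's a hypothetical illustration of that mismatch (p1/p2/p3 are made-up tensors, not fastai code):

```python
import torch

p1, p2, p3 = (torch.randn(2, requires_grad=True) for _ in range(3))
param_lists = [[p1, p2], [p3]]  # nested parameter groups, as a fastai splitter produces

# torch.optim.Adam(param_lists) raises:
#   TypeError: optimizer can only optimize Tensors, but one of the params is list
# A raw torch optimizer instead wants one dict per group:
opt = torch.optim.Adam([{'params': ps} for ps in param_lists], lr=1e-3)
```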

For these reasons, the compose approach does not currently work...

@tyoc213
Contributor

tyoc213 commented Feb 19, 2021

It would be nice to know which PyTorch commit caused this. I guess PyTorch 1.7.0 has been out for some time now, so was the different behaviour introduced between 1.6 and the current 1.7, or can't we be sure of that?

As a side note, from what I know from the TPU work, there are some small differences deep inside fastai's and PyTorch's optimizers, so perhaps this would be a good time to test compatibility between the two in depth?

tmabraham added a commit to tmabraham/fastai that referenced this pull request Mar 6, 2021
@tmabraham
Contributor Author

Closing and bringing the proposed changes over to #3241.

@tmabraham tmabraham closed this Mar 6, 2021
jph00 added a commit that referenced this pull request Mar 6, 2021
* fix optimwrapper to work with param_groups

* change optimwrapper to make it even easier to use

* Update fastai/optimizer.py

Co-authored-by: Jeremy Howard <github@jhoward.fastmail.fm>

* incorporate jeremy's suggestions and add #3226 doc improvements

* add #slow flag to cell in 14_callback.schedule.ipynb

* switch to #cuda

Co-authored-by: Jeremy Howard <github@jhoward.fastmail.fm>
@tmabraham tmabraham deleted the optimwrapper-docs branch August 26, 2021 23:41