
AdaBelief Optimizer doesn't work #95

Closed
dnth opened this issue Apr 1, 2021 · 2 comments

dnth commented Apr 1, 2021

I was toying around with the AdaBelief optimizer in tsai and found that it doesn't work with the notebook examples.
I think it is related to this issue.

Specifically, the code below does not work:
learn = ts_learner(dls, InceptionTime, metrics=[mae, rmse], cbs=ShowGraph(), opt_func=adabelief)

But with a slight change, it works again:

from functools import partial  # OptimWrapper and the rest come from the usual tsai/fastai imports

opt_func = partial(OptimWrapper, opt=AdaBelief)
learn = ts_learner(dls, InceptionTime, metrics=[mae, rmse], cbs=ShowGraph(), opt_func=opt_func)

Here is the notebook I worked on:
https://colab.research.google.com/drive/1QVKrnGx8y5FCFkLp-va7BXIlOGKelvql?usp=sharing


oguiza commented Apr 3, 2021

Thanks for raising this issue, @dnth.
As you said, the issue was due to the changes in OptimWrapper. I've now updated it, and it should work well.
I've verified it with this code:

from tsai.all import *  # standard tsai setup; re-exports the fastai pieces used below

X, y, splits = get_UCR_data('LSST', split_data=False)
tfms = [None, TSClassification()]
batch_tfms = TSStandardize(by_sample=True)
dls = get_ts_dls(X, y, splits=splits, tfms=tfms, batch_tfms=batch_tfms)
learn = ts_learner(dls, InceptionTime, metrics=accuracy, cbs=ShowGraph(), opt_func=adabelief)
learn.fit_one_cycle(5)

In addition to that, I've decided to add a dependency on torch-optimizer, a library that collects most of the optimizers available for PyTorch. You can now choose between many optimizers. All you need to do is:

import torch
import torch_optimizer as optim  # the torch-optimizer package

opt_func = wrap_optimizer(torch.optim.AdamW)  # if it's a PyTorch optimizer
opt_func = wrap_optimizer(optim.Shampoo)      # if it's one of torch-optimizer's
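
For reference, a wrapped optimizer drops into a learner just like adabelief above. A minimal sketch, assuming the same dls as in the snippet earlier in this comment (Yogi is just an arbitrary pick from torch-optimizer):

opt_func = wrap_optimizer(optim.Yogi)  # any torch-optimizer class should work the same way
learn = ts_learner(dls, InceptionTime, metrics=accuracy, opt_func=opt_func)
learn.fit_one_cycle(5)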

I'll make a new pip release during the weekend to reflect these changes.

I'll close this issue based on this and your previous post. If needed, please feel free to reopen it.

oguiza closed this as completed Apr 3, 2021

dnth commented Apr 3, 2021

Having access to torch-optimizer in tsai is super awesome! Thank you for the hard work!
