
Please test and add the AdaBound optimizer to the next stable release #17560

Closed

xuancong84 opened this issue Feb 28, 2019 · 3 comments
@xuancong84

There is now another powerful general-purpose optimizer called AdaBound, which trains as fast as Adam and generalizes as well as SGD; see https://github.com/Luolc/AdaBound.

It appears to work better than Adam on several tasks, including NLP and CV, with slightly higher accuracy and a smoother learning curve.

[image: learning-curve comparison]

I think this would be a great and useful addition, thanks!
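
For reference, the linked repository's README presents AdaBound as a drop-in replacement for Adam (package `adabound` on PyPI); a minimal usage sketch, with the model and batch as placeholders:

```python
# Minimal usage sketch based on the README of https://github.com/Luolc/AdaBound
# (pip install adabound). Model and data are placeholders.
import torch
import torch.nn as nn
import adabound

model = nn.Linear(10, 1)  # placeholder model
# lr is the initial (Adam-like) step size; final_lr is the SGD-like
# learning rate that the dynamic bounds converge to over training.
optimizer = adabound.AdaBound(model.parameters(), lr=1e-3, final_lr=0.1)

x, y = torch.randn(4, 10), torch.randn(4, 1)  # placeholder batch
optimizer.zero_grad()
loss = nn.functional.mse_loss(model(x), y)
loss.backward()
optimizer.step()
```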

@vishwakftw (Contributor)

New implementations such as these can go into pytorch/contrib initially.

@kayuksel commented May 23, 2019

I also agree with the request to add the AdaBound optimizer to contrib. Unfortunately, the author's implementation does not support PyTorch 1.0.
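
For reference, here is a minimal sketch of AdaBound's core update rule (Luo et al., 2019) written against the stock torch.optim.Optimizer API. It is an illustration, not the author's implementation: it omits weight decay, the AMSBound variant, and the lr-schedule coupling of the reference code, and the class name is hypothetical.

```python
# Minimal illustrative sketch of AdaBound's bounded-step update:
# an Adam-style step whose per-parameter learning rate is clipped
# between dynamic bounds that converge to final_lr (SGD-like behavior).
import math
import torch
from torch.optim import Optimizer

class AdaBoundSketch(Optimizer):  # hypothetical name, for illustration
    def __init__(self, params, lr=1e-3, final_lr=0.1,
                 betas=(0.9, 0.999), gamma=1e-3, eps=1e-8):
        defaults = dict(lr=lr, final_lr=final_lr, betas=betas,
                        gamma=gamma, eps=eps)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self):
        for group in self.param_groups:
            beta1, beta2 = group['betas']
            for p in group['params']:
                if p.grad is None:
                    continue
                state = self.state[p]
                if len(state) == 0:
                    state['step'] = 0
                    state['exp_avg'] = torch.zeros_like(p)
                    state['exp_avg_sq'] = torch.zeros_like(p)
                state['step'] += 1
                t = state['step']
                m, v = state['exp_avg'], state['exp_avg_sq']
                # Adam-style first and second moment estimates.
                m.mul_(beta1).add_(p.grad, alpha=1 - beta1)
                v.mul_(beta2).addcmul_(p.grad, p.grad, value=1 - beta2)
                # Bias-corrected Adam step size.
                step_size = (group['lr'] * math.sqrt(1 - beta2 ** t)
                             / (1 - beta1 ** t))
                # Dynamic bounds that tighten around final_lr as t grows,
                # smoothly transitioning the update from Adam toward SGD.
                final_lr = group['final_lr']
                lower = final_lr * (1 - 1 / (group['gamma'] * t + 1))
                upper = final_lr * (1 + 1 / (group['gamma'] * t))
                # Clip the per-parameter learning rate, then apply momentum.
                denom = v.sqrt().add_(group['eps'])
                eta = (step_size / denom).clamp_(lower, upper)
                p.add_(m * eta, alpha=-1)
```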

@soumith (Member) commented May 26, 2019

I'd agree; send a PR to https://github.com/pytorch/contrib with unit tests, and I'm happy to merge.
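
Optimizer unit tests of the kind such a PR would carry typically check that a few steps reduce the loss of a simple convex problem. A minimal pytest-style sketch, assuming the illustrative AdaBoundSketch class from above (the test name and setup are hypothetical):

```python
# Minimal sketch of an optimizer unit test: a few AdaBound steps
# should reduce the loss of a simple quadratic problem.
import torch

def test_adabound_decreases_quadratic_loss():
    torch.manual_seed(0)
    w = torch.randn(10, requires_grad=True)
    target = torch.zeros(10)
    opt = AdaBoundSketch([w], lr=1e-1, final_lr=0.1)

    def loss_fn():
        return ((w - target) ** 2).sum()

    initial = loss_fn().item()
    for _ in range(100):
        opt.zero_grad()
        loss = loss_fn()
        loss.backward()
        opt.step()
    assert loss_fn().item() < initial
```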
