Now we have another powerful general-purpose optimizer called AdaBound, which trains as fast as Adam and generalizes as well as SGD; see https://github.com/Luolc/AdaBound.
It appears to outperform Adam on several tasks in NLP and CV, with slightly higher accuracy and a smoother learning curve.
I think this would be a great and useful addition, thanks!
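For context, the core idea in AdaBound is to clip Adam's per-step learning rate between a lower and an upper bound that both converge toward a fixed `final_lr`, so the optimizer behaves like Adam early in training and gradually transitions toward SGD. Below is a minimal pure-Python sketch of that bound-clipping rule for a single scalar parameter; the function name, state layout, and hyperparameter defaults are illustrative, not the repository's API.

```python
import math

def adabound_step(x, grad, state, lr=0.1, final_lr=0.1, gamma=1e-3,
                  beta1=0.9, beta2=0.999, eps=1e-8):
    """One illustrative AdaBound update on a scalar parameter x.

    state is a (step_count, m, v) tuple holding the Adam moments.
    """
    t, m, v = state
    t += 1
    # Adam's exponential moving averages of the gradient and its square
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    # bias correction, as in Adam
    bias_correction = math.sqrt(1 - beta2 ** t) / (1 - beta1 ** t)
    # dynamic bounds that both converge to final_lr as t grows,
    # so the step size smoothly transitions from Adam-like to SGD-like
    lower = final_lr * (1 - 1 / (gamma * t + 1))
    upper = final_lr * (1 + 1 / (gamma * t))
    step_size = lr * bias_correction / (math.sqrt(v) + eps)
    step_size = min(max(step_size, lower), upper)  # the AdaBound clip
    x = x - step_size * m
    return x, (t, m, v)

# Toy usage: minimize f(x) = x^2, whose gradient is 2x.
x, state = 5.0, (0, 0.0, 0.0)
for _ in range(200):
    x, state = adabound_step(x, 2 * x, state)
```

After 200 steps `x` has moved from 5.0 to near the minimum at 0. The actual implementation in the linked repository applies this element-wise to tensors and adds weight decay and AMSBound variants, which are omitted here for brevity.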
I also agree with the request to add the AdaBound optimizer to contrib. Unfortunately, the implementation provided by the author does not support PyTorch 1.0.