Add support for scheduled weight decays in RectifiedAdam. #1974


Merged

Conversation

@leandro-gracia-gil (Contributor) commented on Jul 7, 2020

RAdam implements weight decay based on AdamW, and the latter supports scheduling for both the learning rate and the weight decay.

This patch extends the existing support for Keras schedulers from the learning rate to the weight decay, matching the weight decay features of AdamW.

fixes #1908
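The resulting behavior mirrors AdamW: the `weight_decay` argument can be either a plain float or a Keras-style schedule that is evaluated at the current training step. The dispatch pattern can be sketched in plain Python; the names below (`resolve_hyperparameter`, `ExponentialDecaySchedule`) are illustrative stand-ins, not the actual addons implementation.

```python
# Sketch of accepting either a constant or a Keras-style schedule for a
# hyperparameter such as weight_decay. All names here are illustrative.

class ExponentialDecaySchedule:
    """Minimal stand-in for tf.keras.optimizers.schedules.ExponentialDecay."""

    def __init__(self, initial_value, decay_rate, decay_steps):
        self.initial_value = initial_value
        self.decay_rate = decay_rate
        self.decay_steps = decay_steps

    def __call__(self, step):
        # Evaluate the schedule at the given training step.
        return self.initial_value * self.decay_rate ** (step / self.decay_steps)


def resolve_hyperparameter(value, step):
    """Return the effective value at `step`: call schedules, pass floats through."""
    if callable(value):
        return value(step)
    return value


# A fixed decay behaves exactly as before...
print(resolve_hyperparameter(1e-4, step=100))      # 0.0001
# ...while a schedule is evaluated at the current training step.
schedule = ExponentialDecaySchedule(1e-4, decay_rate=0.5, decay_steps=100)
print(resolve_hyperparameter(schedule, step=100))  # 5e-05
```

In the real API, a `tf.keras.optimizers.schedules.LearningRateSchedule` instance would simply be passed as `weight_decay=` when constructing `RectifiedAdam`.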

RAdam implements weight decay based on AdamW, and the latter supports scheduling
for both learning rate and weight decays as part of its warm restarts version.

This patch extends existing support of Keras schedulers for the learning rate to
weight decay, matching the weight decay features of AdamW.
@bot-of-gabrieldemarmiesse

@CyberZHG

You are an owner of some files modified in this pull request.
Would you kindly review the changes whenever you have the time?
Thank you very much.

This problem appears to already affect the deserialization of learning rate schedulers, regardless of this patch.
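The deserialization issue is that an optimizer config stores a schedule as a nested config dict, which must be rebuilt into a schedule object on load rather than used as-is. A minimal round-trip sketch of that pattern follows; all class and helper names are illustrative, and the real fix relies on Keras' own serialize/deserialize helpers.

```python
# Sketch of serializing and deserializing a schedule hyperparameter.
# All names here are illustrative, not the actual addons implementation.

def serialize(value):
    """Store floats directly; store schedules as a config dict."""
    if isinstance(value, (int, float)):
        return value
    return {"class_name": type(value).__name__, "config": value.get_config()}


def deserialize(value, registry):
    """Rebuild a schedule from its config dict; pass floats through unchanged."""
    if isinstance(value, dict):
        cls = registry[value["class_name"]]
        return cls(**value["config"])
    return value


class LinearDecay:
    """Toy schedule that decreases linearly with the training step."""

    def __init__(self, initial_value, decay_per_step):
        self.initial_value = initial_value
        self.decay_per_step = decay_per_step

    def __call__(self, step):
        return self.initial_value - self.decay_per_step * step

    def get_config(self):
        return {"initial_value": self.initial_value,
                "decay_per_step": self.decay_per_step}


registry = {"LinearDecay": LinearDecay}
config = serialize(LinearDecay(0.01, 1e-5))
restored = deserialize(config, registry)
print(restored(100))  # 0.009
```

Without the rebuild step, the optimizer would receive the raw dict in place of a callable schedule and fail at the first evaluation.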
@WindQAQ (Member) left a comment

Generally LGTM :-) Thanks for the contribution.

@WindQAQ (Member) left a comment

LGTM. Thanks!

@WindQAQ WindQAQ merged commit d32a099 into tensorflow:master Jul 8, 2020
ashutosh1919 pushed a commit to ashutosh1919/addons that referenced this pull request Jul 12, 2020
…#1974)

* Add support for scheduled weight decays in RectifiedAdam.

* Fix code style issues.

* Fix deserialization of schedulers.

* Fix comparison when using the optimizer inside a tf.function.
jrruijli pushed a commit to jrruijli/addons that referenced this pull request Dec 23, 2020
…#1974)

Successfully merging this pull request may close these issues.

RectifiedAdam is not supporting scheduling for weight decays, while AdamW does