Add Yogi optimizer description, reference, arg doc strings to Optax #90

Merged (2 commits) on Apr 7, 2021

Conversation

@8bitmp3 (Contributor) commented on Apr 4, 2021

This PR adds a short description, a paper reference, and arg docstrings to the Yogi optimizer in Optax, which was previously undocumented. Related to #71 by @mtthss

  ...
  Yogi is an adaptive optimiser, which provides control in tuning the effective
  learning rate to prevent it from increasing. By doing so, it focuses on
  addressing the issues of convergence and generalisation in exponential moving
  average-based adaptive methods (such as Adam and RMSprop). Yogi is a
  modification of Adam and uses the same parameters.
  ...

Used Google Scholar/Research to obtain a short link for the paper: http://www.sanjivk.com/yogi_nips2018.pdf. The alternative is https://papers.nips.cc/paper/2018/file/90365351ccc7437a1309dc64e4db32a3-Paper.pdf but it may not pass the lint test @mtthss.
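The "control in tuning the effective learning rate to prevent it from increasing" that the docstring describes comes from Yogi's sign-controlled, additive second-moment update. A minimal plain-Python sketch of that rule from the referenced paper (an illustration only, not Optax's actual implementation; the function names here are made up for clarity):

```python
import math

def adam_second_moment(v, g, b2=0.999):
    # Adam/RMSprop-style exponential moving average of squared gradients.
    # v can shrink multiplicatively when gradients get small, which can
    # make the effective learning rate lr / sqrt(v) increase sharply.
    return b2 * v + (1 - b2) * g * g

def yogi_second_moment(v, g, b2=0.999):
    # Yogi (Zaheer et al., NeurIPS 2018) replaces the EMA with an
    # additive update whose direction depends on sign(v - g^2):
    #   v_t = v_{t-1} - (1 - b2) * sign(v_{t-1} - g^2) * g^2
    # so v changes by at most (1 - b2) * g^2 per step in either
    # direction, bounding how fast the effective learning rate moves.
    g2 = g * g
    return v - (1 - b2) * math.copysign(g2, v - g2)
```

For example, with `v = 1.0`, `g = 2.0`, and `b2 = 0.9`: `g * g = 4.0 > v`, so both rules increase `v`, but Yogi adds exactly `(1 - b2) * g2 = 0.4` (giving `1.4`) rather than interpolating toward `g2` as Adam does (giving `1.3`).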

@google-cla google-cla bot added the "cla: yes" label (copybara label for automatic import) on Apr 4, 2021
@copybara-service copybara-service bot merged commit 63b564e into google-deepmind:master Apr 7, 2021
@8bitmp3 8bitmp3 deleted the patch-1 branch on Apr 7, 2021