
Adding AdaMod Optimizer in TensorFlow 2.x #35856

Open: wants to merge 2 commits into master
Conversation

monk1337 commented Jan 14, 2020
monk1337 commented Jan 14, 2020

System information

TensorFlow version (you are using): 2.0
Are you willing to contribute it (Yes/No): Yes

Describe the feature and the current behavior/state.

I would like to add a new optimizer called "AdaMod". AdaMod was proposed by Jianbang Ding in the paper "An Adaptive and Momental Bound Method for Stochastic Learning".
AdaMod introduces a new hyperparameter, beta3, which controls the degree of memory in an exponential moving average of the per-parameter step sizes; each step size is then capped by this average. The reported benefits are improved convergence, no need for a warmup schedule, and less sensitivity to the chosen learning rate.
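For reference, below is a minimal NumPy sketch of the AdaMod update rule as described in the paper. It is illustrative only and is not the code added in this PR; the function name, the state dict layout, and the default hyperparameters are my own choices.

import numpy as np

def adamod_update(theta, grad, state, lr=1e-3,
                  beta1=0.9, beta2=0.999, beta3=0.999, eps=1e-8):
    """One AdaMod step (illustrative sketch, not this PR's implementation)."""
    state['t'] += 1
    t = state['t']

    # Adam-style biased first and second moment estimates, then bias correction.
    state['m'] = beta1 * state['m'] + (1 - beta1) * grad
    state['v'] = beta2 * state['v'] + (1 - beta2) * grad ** 2
    m_hat = state['m'] / (1 - beta1 ** t)
    v_hat = state['v'] / (1 - beta2 ** t)

    # Per-parameter step size as Adam would compute it.
    eta = lr / (np.sqrt(v_hat) + eps)

    # AdaMod: keep an exponential moving average of step sizes (beta3 sets the
    # memory) and cap the current step size by that average ("momental bound").
    state['s'] = beta3 * state['s'] + (1 - beta3) * eta
    eta_hat = np.minimum(eta, state['s'])

    return theta - eta_hat * m_hat, state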

Will this change the current API? How?
I've edited optimizers.py and created a new file, adamod.py, which is compatible with TensorFlow 2.x.
Usage:

model.compile(optimizer='adamod',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
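Assuming the new class is exported as AdaMod under tf.keras.optimizers (a guess based on the adamod.py file in this PR; the actual export path and argument names may differ), it could also be passed as an instance to set beta_3 explicitly:

import tensorflow as tf

# Hypothetical explicit usage; class and argument names are assumptions,
# not confirmed by this PR. `model` is a compiled-ready tf.keras model.
optimizer = tf.keras.optimizers.AdaMod(learning_rate=1e-3, beta_3=0.999)
model.compile(optimizer=optimizer,
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])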
monk1337 added 2 commits Jan 14, 2020
tensorflow-bot added the size:L label Jan 14, 2020
googlebot added the cla: yes label Jan 14, 2020
gbaned self-assigned this Jan 14, 2020
gbaned added the comp:keras label Jan 14, 2020
gbaned added this to Assigned Reviewer in PR Queue via automation Jan 14, 2020
gbaned requested a review from tanzhenyu Jan 14, 2020