Merge pull request #4970 from toslunar/bp-4895-adam_alpha
[backport] Fix Adam alpha argument explanation
okuta committed Jun 23, 2018
2 parents a5dbf22 + 7dcfdba commit 29cf807
Showing 1 changed file with 2 additions and 2 deletions.
chainer/optimizers/adam.py (4 changes: 2 additions & 2 deletions)
@@ -50,7 +50,7 @@ class AdamRule(optimizer.UpdateRule):
     Args:
         parent_hyperparam (~chainer.optimizer.Hyperparameter): Hyperparameter
             that provides the default values.
-        alpha (float): Step size.
+        alpha (float): Coefficient of learning rate.
         beta1 (float): Exponential decay rate of the first order moment.
         beta2 (float): Exponential decay rate of the second order moment.
         eps (float): Small value for the numerical stability.
@@ -181,7 +181,7 @@ class Adam(optimizer.GradientMethod):
     <https://openreview.net/forum?id=ryQu7f-RZ>`_

     Args:
-        alpha (float): Step size.
+        alpha (float): Coefficient of learning rate.
         beta1 (float): Exponential decay rate of the first order moment.
         beta2 (float): Exponential decay rate of the second order moment.
         eps (float): Small value for the numerical stability.
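Why "step size" was misleading: in Chainer's Adam, alpha scales a bias-corrected learning rate rather than being the raw per-update step. A minimal sketch of that relationship, following the lr property in chainer/optimizers/adam.py (the function name effective_lr is illustrative, not part of the API):

import math

def effective_lr(alpha=0.001, beta1=0.9, beta2=0.999, t=1):
    # Bias-correction factors for the first and second moment estimates;
    # t is the 1-indexed update count.
    fix1 = 1.0 - beta1 ** t
    fix2 = 1.0 - beta2 ** t
    # The learning rate Adam actually applies at step t: alpha is a
    # coefficient of this quantity, not the step size itself.
    return alpha * math.sqrt(fix2) / fix1

# At t=1 with the defaults this gives about 0.000316, not alpha=0.001.
print(effective_lr())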
