fix of Adam optimizer as pointed out in #143
Summary: fix for #143. The step counter count_ and the bias-correction terms were computed inside the per-parameter loop, so count_ advanced once per parameter tensor rather than once per optimizer step; they are now computed once per step, before the loop.

Reviewed By: syhw, jacobkahn

Differential Revision: D22757150

fbshipit-source-id: 1c9267a7a594f7d3b7041389b7e40715cb14e6b4
Tatiana Likhomanenko authored and facebook-github-bot committed Jul 27, 2020
1 parent 4f159fa commit e62eb7e
Showing 1 changed file with 5 additions and 6 deletions.
11 changes: 5 additions & 6 deletions flashlight/optim/AdamOptimizer.cpp
@@ -52,6 +52,11 @@ AdamOptimizer::AdamOptimizer(
 }
 
 void AdamOptimizer::step() {
+  count_++;
+  float correctedBias1 = 1 - std::pow(beta1_, count_);
+  float correctedBias2 = 1 - std::pow(beta2_, count_);
+  float correctedLr = lr_ * std::sqrt(correctedBias2) / correctedBias1;
+
   for (size_t i = 0; i < parameters_.size(); i++) {
     if (!parameters_[i].isGradAvailable()) {
       continue;
@@ -74,12 +79,6 @@ void AdamOptimizer::step() {
     af::eval(biasedFirst);
     af::eval(biasedSecond);
 
-    count_++;
-
-    float correctedBias1 = 1 - std::pow(beta1_, count_);
-    float correctedBias2 = 1 - std::pow(beta2_, count_);
-    float correctedLr = lr_ * std::sqrt(correctedBias2) / correctedBias1;
-
     data = data - (correctedLr * biasedFirst) / (af::sqrt(biasedSecond) + eps_);
 
     af::eval(data);
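For context, the effect of the fix is that the timestep and the bias-correction terms are now computed once per call to step() and shared by every parameter, instead of count_ being re-incremented for each parameter tensor in the loop. A minimal, self-contained sketch of the corrected ordering in plain C++ follows; it operates on flat float buffers rather than flashlight's ArrayFire arrays, and all names in it are illustrative, not flashlight's API:

#include <cmath>
#include <cstddef>
#include <vector>

// Illustrative Adam step mirroring the fixed ordering: the step counter and
// the bias-correction terms are computed once per step, outside the
// per-parameter loop, so every parameter sees the same correctedLr.
struct ToyAdam {
  float lr = 1e-3f, beta1 = 0.9f, beta2 = 0.999f, eps = 1e-8f;
  int count = 0;
  std::vector<float> m, v; // running first- and second-moment estimates

  void step(std::vector<float>& params, const std::vector<float>& grads) {
    if (m.empty()) {
      m.assign(params.size(), 0.0f);
      v.assign(params.size(), 0.0f);
    }
    count++; // once per step, NOT once per parameter (the bug behind #143)
    float correctedBias1 = 1.0f - std::pow(beta1, count);
    float correctedBias2 = 1.0f - std::pow(beta2, count);
    float correctedLr = lr * std::sqrt(correctedBias2) / correctedBias1;
    for (std::size_t i = 0; i < params.size(); i++) {
      m[i] = beta1 * m[i] + (1.0f - beta1) * grads[i];
      v[i] = beta2 * v[i] + (1.0f - beta2) * grads[i] * grads[i];
      params[i] -= correctedLr * m[i] / (std::sqrt(v[i]) + eps);
    }
  }
};

With the old ordering, a model with N parameter tensors advanced count_ by N on every step, so each tensor saw a different, too-large timestep in the bias correction and the effective learning rate diverged from the standard Adam schedule from the first update onward.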
