This repository has been archived by the owner on Aug 3, 2021. It is now read-only.

Commit 27346d1

Update automatic_loss_scaler.py

Signed-off-by: Oleksii Kuchaiev
okuchaiev committed May 22, 2019
1 parent 94e783e, commit 27346d1
Showing 1 changed file with 2 additions and 2 deletions.
open_seq2seq/optimizers/automatic_loss_scaler.py (4 changes: 2 additions & 2 deletions)

```diff
@@ -62,7 +62,7 @@ def __init__(self, params):
         },
     )
     self.scale_min = params.get('scale_min', 1.0)
-    self.scale_max = params.get('scale_max', 2.**24)
+    self.scale_max = params.get('scale_max', 2.**14)
     self.step_factor = params.get('step_factor', 2.0)
     self.step_window = params.get('step_window', 2000)
@@ -127,7 +127,7 @@ def __init__(self, params):
         },
     )
     self.scale_min = params.get('scale_min', 1.0)
-    self.scale_max = params.get('scale_max', 2.**24)
+    self.scale_max = params.get('scale_max', 2.**14)
     self.log_max = params.get('log_max', 16.)
     self.beta1 = params.get('beta1', 0.99)
     self.beta2 = params.get('beta2', 0.999)
```
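For context, the parameters touched by this diff (`scale_min`, `scale_max`, `step_factor`, `step_window`) configure a "backoff" automatic loss scaler for mixed-precision training: the scale is halved whenever a gradient overflow is detected, and grown again after a window of overflow-free steps. The sketch below is a minimal illustration of that general scheme, not the OpenSeq2Seq implementation; the class name, the window-reset behavior after growth, and the small default values are assumptions for demonstration.

```python
class BackoffLossScaler:
    """Minimal sketch of backoff dynamic loss scaling (illustrative only,
    not the OpenSeq2Seq code). The scale shrinks by step_factor on every
    overflow and grows by step_factor after step_window clean steps,
    clamped to [scale_min, scale_max]."""

    def __init__(self, scale_min=1.0, scale_max=2.**14,
                 step_factor=2.0, step_window=2000):
        self.scale_min = scale_min
        # scale_max is the value this commit lowers from 2.**24 to 2.**14.
        self.scale_max = scale_max
        self.step_factor = step_factor
        self.step_window = step_window
        self.scale = scale_max          # start at the ceiling (assumption)
        self.last_overflow_step = -1
        self.step = 0

    def update(self, has_overflow):
        """Advance one training step and return the new loss scale."""
        if has_overflow:
            # Back off: shrink the scale, but never below scale_min.
            self.scale = max(self.scale / self.step_factor, self.scale_min)
            self.last_overflow_step = self.step
        elif self.step - self.last_overflow_step >= self.step_window:
            # A full clean window has passed: try a larger scale again.
            self.scale = min(self.scale * self.step_factor, self.scale_max)
            self.last_overflow_step = self.step  # restart the window (assumption)
        self.step += 1
        return self.scale
```

With a small `step_window`, the shrink-then-regrow behavior is easy to trace: two consecutive overflows quarter the scale, and two clean steps later it doubles back toward `scale_max`.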
