Remove redundant note in AdamW
OverLordGoldDragon committed Oct 29, 2019
1 parent 606ad07 · commit 3d6a7cb
Showing 1 changed file with 0 additions and 2 deletions.
keras_adamw/optimizers.py (0 additions, 2 deletions)

@@ -39,8 +39,6 @@ class AdamW(Optimizer):
      init_verbose: bool. If True, print weight-name--weight-decay, and
                    lr-multiplier--layer-name value pairs set during
                    optimizer initialization (recommended)
-     *NOTE*: add below line before model.save when disabling eager execution:
-         tf.compat.v1.experimental.output_all_intermediates(True)  # bug workaround
      # <1> - if using 'warm restarts', then refers to total expected iterations
              for a given restart; can be an estimate, and training won't stop
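For reference, the deleted note described a graph-mode saving workaround. Below is a minimal sketch of where that call would sit, assuming TensorFlow 2.x with eager execution disabled; the model, optimizer choice, and file name are illustrative placeholders, not part of this repository:

    import tensorflow as tf

    tf.compat.v1.disable_eager_execution()  # run in graph (non-eager) mode

    # Placeholder model; any compiled Keras model would do here.
    model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
    model.compile(optimizer='adam', loss='mse')

    # The workaround from the removed note: call before model.save()
    # when eager execution is disabled, so intermediate graph outputs
    # are emitted and saving does not error out.
    tf.compat.v1.experimental.output_all_intermediates(True)
    model.save('model.h5')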
