Added optional initial_lr parameter to LearningRateScheduleCallback and updated docs #1933
Conversation
…nd updated docs Signed-off-by: Travis Addair <taddair@uber.com>
examples/keras_imagenet_resnet50.py
@@ -147,10 +147,11 @@
hvd.callbacks.LearningRateWarmupCallback(warmup_epochs=args.warmup_epochs, verbose=verbose),
Pass `initial_lr` to `LearningRateWarmupCallback` as well?
Good catch! Fixed.
Signed-off-by: Travis Addair <taddair@uber.com>
@@ -99,9 +99,14 @@ def __init__(self, multiplier, start_epoch=0, end_epoch=None, staircase=True,
steps_per_epoch: The callback will attempt to autodetect number of batches per
    epoch with Keras >= 2.0.0. Provide this value if you have an older
    version of Keras.
initial_lr: Initial learning rate at the start of training.
In the doc above for this callback, could we explicitly mention that when using a constant multiplier, it applies to the `initial_lr` and not to the current learning rate? (This differs from the TF Keras 2.2 callback, which applies the multiplier to the current lr.)
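To make the distinction in the comment above concrete, here is a minimal pure-Python sketch (not Horovod's actual implementation; function names are illustrative) of the two behaviours: a constant multiplier applied to a fixed `initial_lr` versus a multiplier compounded onto the current lr, as a TF Keras 2.2-style callback would do.

```python
def schedule_from_initial(initial_lr, multiplier, num_epochs):
    """Horovod-style: each epoch's lr is multiplier * initial_lr (stateless)."""
    return [initial_lr * multiplier for _ in range(num_epochs)]

def schedule_from_current(initial_lr, multiplier, num_epochs):
    """TF-Keras-2.2-style: the multiplier is applied to the *current* lr,
    so a constant multiplier compounds across epochs."""
    lrs, lr = [], initial_lr
    for _ in range(num_epochs):
        lr = lr * multiplier
        lrs.append(lr)
    return lrs

# With a constant multiplier of 0.5 and initial_lr=0.1:
print(schedule_from_initial(0.1, 0.5, 3))  # [0.05, 0.05, 0.05]
print(schedule_from_current(0.1, 0.5, 3))  # [0.05, 0.025, 0.0125]
```

With a constant multiplier, the first form holds the lr steady while the second decays it geometrically, which is why the docstring note was requested.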
Same for the TF Keras file.
Never mind, I missed the class description at:
`LearningRateScheduleCallback` sets learning rate between epochs `start_epoch` and
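The class description quoted above refers to an epoch range. A toy sketch of that behaviour (hypothetical helper, not Horovod's code), assuming the multiplier only applies between `start_epoch` (inclusive) and `end_epoch` (exclusive):

```python
def lr_for_epoch(epoch, initial_lr, multiplier, start_epoch=0, end_epoch=None):
    """Toy illustration: apply the multiplier only inside
    [start_epoch, end_epoch); otherwise leave the lr at its initial value.
    `multiplier` may be a constant or a callable taking the epoch."""
    in_range = epoch >= start_epoch and (end_epoch is None or epoch < end_epoch)
    if in_range:
        m = multiplier(epoch) if callable(multiplier) else multiplier
        return initial_lr * m
    return initial_lr

print(lr_for_epoch(5, 0.1, 0.5, start_epoch=4, end_epoch=8))  # 0.05
print(lr_for_epoch(9, 0.1, 0.5, start_epoch=4, end_epoch=8))  # 0.1
```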
It looks good!
@@ -99,9 +99,14 @@ def __init__(self, multiplier, start_epoch=0, end_epoch=None, staircase=True,
steps_per_epoch: The callback will attempt to autodetect number of batches per
    epoch with Keras >= 2.0.0. Provide this value if you have an older
    version of Keras.
initial_lr: Initial learning rate at the start of training.
Never mind, I missed the class description at:
`LearningRateScheduleCallback` sets learning rate between epochs `start_epoch` and
Fixes #1921.
Adds deprecation warnings with removed versions.
Fixes docs for SyncBatchNorm.
Signed-off-by: Travis Addair <taddair@uber.com>