Gluon trainer updates: add learning_rate and lr_scheduler properties and add setter for learning rate #7659
Conversation
@@ -113,6 +113,36 @@ def _init_kvstore(self):
        self._kv_initialized = True

    @property
    def learning_rate(self):
document this as

Properties
----------

in the init doc
Also report learning_rate when using lr_scheduler
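What the reviewer asks for can be sketched as follows. This is an illustrative toy, not MXNet's actual `Optimizer`/`LRScheduler` classes: the scheduler here (`StepScheduler`) and its halving rule are invented for the example; only the idea of the getter consulting the scheduler comes from the review.

```python
# Illustrative sketch only: class and attribute names are assumptions,
# not MXNet's real Optimizer/LRScheduler API.

class StepScheduler:
    """Hypothetical scheduler: halves the base rate every `step` updates."""
    def __init__(self, base_lr, step):
        self.base_lr = base_lr
        self.step = step
        self.num_update = 0

    def __call__(self):
        return self.base_lr * (0.5 ** (self.num_update // self.step))


class Optimizer:
    def __init__(self, lr=0.01, lr_scheduler=None):
        self.lr = lr
        self.lr_scheduler = lr_scheduler

    @property
    def learning_rate(self):
        # Report the scheduler's current rate when one is defined,
        # otherwise the static lr.
        if self.lr_scheduler is not None:
            return self.lr_scheduler()
        return self.lr


opt = Optimizer(lr=0.1, lr_scheduler=StepScheduler(0.1, step=100))
print(opt.learning_rate)   # 0.1 before any updates
```

With this shape, `learning_rate` stays truthful whether or not a scheduler drives the rate.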
resolved
python/mxnet/gluon/trainer.py
Outdated
    @property
    def lr_scheduler(self):
don't expose this for now
resolved
python/mxnet/gluon/trainer.py
Outdated
    @property
    def learning_rate(self):
        return self._optimizer.lr
This kind of implementation couples the two classes together (i.e. Trainer must know the structure of the optimizer). Instead, add accessors in the optimizer and make reading the learning rate the optimizer's concern.
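The decoupling being suggested can be sketched like this (minimal hypothetical classes, not MXNet's real ones): the Trainer reads through a public accessor on the optimizer instead of reaching into `_optimizer.lr`.

```python
# Sketch under assumed class shapes; not MXNet's actual code.

class Optimizer:
    def __init__(self, lr=0.01):
        self.lr = lr

    @property
    def learning_rate(self):
        # Reading the current rate is the optimizer's own concern.
        return self.lr


class Trainer:
    def __init__(self, optimizer):
        self._optimizer = optimizer

    @property
    def learning_rate(self):
        # Delegate: Trainer needs no knowledge of the optimizer's internals.
        return self._optimizer.learning_rate


trainer = Trainer(Optimizer(lr=0.1))
print(trainer.learning_rate)  # 0.1
```

If the optimizer later changes how it stores or computes the rate (e.g. via a scheduler), Trainer code is unaffected.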
resolved
python/mxnet/gluon/trainer.py
Outdated
                              "learning rate only when the LRScheduler of"
                              "the optimizer is undefined.")
        else:
            self._optimizer.lr = lr
Same as the accessor comment. Try to make setting the learning rate the optimizer's concern.
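Applying the same delegation idea to writes gives a sketch like the following (class names are illustrative, and the guard condition paraphrases the one in the diff above; this is not MXNet's actual code). The optimizer enforces its own invariant, and the Trainer only forwards the call.

```python
# Illustrative sketch: the optimizer enforces its own invariant
# (no manual lr while an LRScheduler controls it); Trainer only delegates.

class Optimizer:
    def __init__(self, lr=0.01, lr_scheduler=None):
        self.lr = lr
        self.lr_scheduler = lr_scheduler

    def set_learning_rate(self, lr):
        if self.lr_scheduler is not None:
            raise UserWarning("set_learning_rate can mutate the learning rate "
                              "only when the LRScheduler of the optimizer "
                              "is undefined.")
        self.lr = lr


class Trainer:
    def __init__(self, optimizer):
        self._optimizer = optimizer

    def set_learning_rate(self, lr):
        # Setting the rate is the optimizer's concern; just forward it.
        self._optimizer.set_learning_rate(lr)


trainer = Trainer(Optimizer(lr=0.1))
trainer.set_learning_rate(0.05)
```

The scheduler check now lives next to the state it protects, so every caller of `set_learning_rate` gets the same guard for free.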
resolved
Dudes, thank you for your comments. The code is updated. Let me know if you spot any further issues.
python/mxnet/optimizer.py
Outdated
@@ -191,6 +204,24 @@ def update(self, index, weight, grad, state):
        """
        raise NotImplementedError()

    def set_learning_rate(self, lr):
@learning_rate.setter
def set_learning_rate

and then you can do

optimizer.learning_rate = 0.5
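As a runnable sketch of the suggestion (illustrative class, not MXNet's real optimizer): note one detail the shorthand above elides — for `optimizer.learning_rate = 0.5` to dispatch to the setter, the decorated function must carry the same name as the property.

```python
class Optimizer:
    def __init__(self, lr=0.01):
        self._lr = lr

    @property
    def learning_rate(self):
        return self._lr

    @learning_rate.setter
    def learning_rate(self, lr):
        # The setter must share the property's name for attribute
        # assignment syntax to reach it.
        self._lr = lr


opt = Optimizer()
opt.learning_rate = 0.5
print(opt.learning_rate)  # 0.5
```

This gives callers plain attribute syntax while keeping validation hooks available inside the setter.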
resolved
resolved
python/mxnet/gluon/trainer.py
Outdated
            raise UserWarning("Optimizer has to be defined before its learning"
                              "rate is mutated.")
        else:
            self._optimizer.set_learning_rate(lr)
Update according to the comment below
resolved
python/mxnet/gluon/trainer.py
Outdated
    ----------
    learning_rate : float
        The learning rate of the optimizer or the learning rate of the
        LRScheduler of the optimizer if the LRScheduler is defined.
explain it can be accessed with trainer.learning_rate and set with trainer.learning_rate = xxx
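A sketch of the requested docstring wording (an illustrative stub in numpydoc style; the surrounding class body is omitted and the exact phrasing is an assumption, not the merged text):

```python
class Trainer:
    """Applies an optimizer to a set of parameters (illustrative stub).

    Properties
    ----------
    learning_rate : float
        The current learning rate of the optimizer. It can be read with
        ``trainer.learning_rate`` and, when no LRScheduler is defined,
        set with ``trainer.learning_rate = new_lr``.
    """
```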
resolved
All tests passed except R GPU; @thirdwing is fixing this in #7686.
tests/python/unittest/test_loss.py
Outdated
@@ -70,15 +70,14 @@ def test_ce_loss():
    label = mx.nd.array(np.random.randint(0, nclass, size=(N,)), dtype='int32')
Submit the test fix in a separate PR and revert the change to the file mode (100644 → 100755).
resolved
The unit test patch is in a separate PR as suggested by @piiswrong.
Closing this PR due to new PR at #7760
No description provided.