Custom Optimizer and LR schedulers #290
Conversation
Coverage report

The coverage rate went from
Diff Coverage details (click to unfold):

- src/renate/updaters/experimental/repeated_distill.py
- src/renate/utils/module.py
- src/renate/shift/detector.py
- src/renate/updaters/experimental/fine_tuning.py
- src/renate/updaters/experimental/gdumb.py
- src/renate/utils/optimizer.py
- src/renate/cli/parsing_functions.py
- src/renate/cli/run_training.py
- src/renate/updaters/experimental/offline_er.py
- src/renate/updaters/model_updater.py
- src/renate/updaters/experimental/joint.py
- src/renate/updaters/learner.py
- src/renate/defaults.py
- src/renate/updaters/avalanche/model_updater.py
- src/renate/memory/storage.py
- src/renate/updaters/experimental/er.py
- src/renate/utils/distributed_strategies.py
- src/renate/shift/ks_detector.py
    lr_scheduler_plugin = None
    optimizers_scheduler = self._dummy_learner.configure_optimizers()
    if isinstance(optimizers_scheduler, tuple):
        optimizer, scheduler_config = optimizers_scheduler
What do these two lines do?
The return value can be either Tuple[Optimizer, Scheduler] or a plain Optimizer. We check for the tuple case and conditionally add an LRCallback.
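The two return shapes being distinguished can be sketched as follows. This is a dependency-free illustration, not the actual Renate code: `Optimizer`, `Scheduler`, and `unpack` are hypothetical stand-ins for the torch objects and the learner logic in the diff.

```python
class Optimizer:
    """Hypothetical stand-in for a torch.optim.Optimizer."""

class Scheduler:
    """Hypothetical stand-in for a torch LR scheduler."""

def configure_optimizers(use_scheduler: bool):
    # Mirrors the two return shapes the diff handles: either a bare
    # optimizer, or an (optimizer, scheduler) tuple.
    optimizer = Optimizer()
    if use_scheduler:
        return optimizer, Scheduler()
    return optimizer

def unpack(result):
    # Only attach an LR callback/plugin when a scheduler was returned.
    if isinstance(result, tuple):
        optimizer, scheduler_config = result
    else:
        optimizer, scheduler_config = result, None
    return optimizer, scheduler_config
```

The `isinstance(..., tuple)` check is what lets a single code path accept both return conventions.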
The `self.save_hyperparameters` call must now include the optimizer object.
@@ -421,6 +290,8 @@ def get_simple_updater(
    metric=None,
    deterministic_trainer=False,
):
    if learner_kwargs is None:
Why this form?
I don't understand. Are you asking why we compare with `is None`?
Yes. Is there a test that populates different `learner_kwargs`?
Yes, e.g., test_joint.py L27. This change does not alter behavior; it avoids the unwanted behavior that arises when using dicts and the like as default argument values.
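The pitfall being avoided is the classic mutable-default-argument bug: Python evaluates default values once, at function definition time. A minimal sketch (the function names and the `memory_size` key are illustrative, not Renate's API):

```python
def bad_updater(learner_kwargs={}):
    # BUG: the default dict is created once and shared across all calls,
    # so mutations made in one call leak into every later call.
    learner_kwargs.setdefault("memory_size", 32)
    return learner_kwargs

def good_updater(learner_kwargs=None):
    # The `is None` pattern creates a fresh dict per call,
    # keeping each call's state isolated.
    if learner_kwargs is None:
        learner_kwargs = {}
    learner_kwargs.setdefault("memory_size", 32)
    return learner_kwargs
```

Calling `bad_updater()` twice returns the same dict object both times, so any key added after the first call reappears in the second; `good_updater()` never exhibits this.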
learning_rate_scheduler_step_size: int = defaults.LEARNING_RATE_SCHEDULER_STEP_SIZE,
momentum: float = defaults.MOMENTUM,
weight_decay: float = defaults.WEIGHT_DECAY,
learning_rate_scheduler: Optional[partial] = None,
Type Callable?
Changed it. I've also ignored the lr_scheduler
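The suggested `Callable` annotation subsumes `partial`, since `functools.partial` objects are themselves callables. A hedged sketch under assumed names (`StepLR` here is a minimal stand-in, and `build_scheduler` is not Renate's actual signature):

```python
from functools import partial
from typing import Callable, Optional

class StepLR:
    # Minimal stand-in for torch.optim.lr_scheduler.StepLR.
    def __init__(self, optimizer, step_size: int, gamma: float = 0.1):
        self.optimizer = optimizer
        self.step_size = step_size
        self.gamma = gamma

def build_scheduler(
    optimizer,
    learning_rate_scheduler: Optional[Callable] = None,
):
    # `Callable` accepts plain functions, lambdas, and partials alike,
    # so callers can pre-bind scheduler hyperparameters.
    if learning_rate_scheduler is None:
        return None
    return learning_rate_scheduler(optimizer)
```

Typing the parameter as `Optional[partial]` would needlessly reject an ordinary function or lambda that builds the scheduler, which is why `Callable` is the broader and more idiomatic choice.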
@@ -47,6 +47,11 @@ instantiating the method you selected. See :doc:`supported_algorithms` for more

.. note::
    If you have defined the ``optimizer_fn`` function in your Renate config, do not pass values for the keys
Is there a check for this in the code?
Yes. run_training_job.py will not add these params to the argparse code, so the script will fail if these arguments are passed but not expected.
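The described failure mode can be reproduced with plain `argparse`. This is a minimal illustration, not Renate's actual parser; the argument names are made up for the example:

```python
import argparse

parser = argparse.ArgumentParser()
# Only arguments the script actually expects are registered; optimizer
# keys such as --momentum are deliberately left out when optimizer_fn
# is defined in the config.
parser.add_argument("--learning_rate", type=float, default=0.01)

# parse_args() would exit with "unrecognized arguments: --momentum 0.9".
# parse_known_args() lets us inspect the rejected arguments instead.
args, unknown = parser.parse_known_args(
    ["--learning_rate", "0.1", "--momentum", "0.9"]
)
```

Because `parse_args()` errors out on any unregistered flag, simply not registering the optimizer keys is itself the check: passing them causes the script to fail immediately.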
Optimizer and LR scheduler can be defined via the Renate config file.
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.