fix: CosineAnnealingLR is missing #757

Merged
9 commits merged on Nov 15, 2021
6 changes: 5 additions & 1 deletion pl_bolts/models/self_supervised/moco/moco2_module.py
@@ -307,7 +307,11 @@ def configure_optimizers(self):
             momentum=self.hparams.momentum,
             weight_decay=self.hparams.weight_decay,
         )
-        return optimizer
+        scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
+            optimizer,
+            self.trainer.max_epochs,
+        )
+        return [optimizer], [scheduler]
 
     @staticmethod
     def add_model_specific_args(parent_parser):
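For context, here is a minimal sketch of how the patched `configure_optimizers` hook fits together. The SGD optimizer, the `learning_rate` hyperparameter name, and the `MocoV2Sketch` class name are assumptions for illustration; only the momentum/weight_decay arguments and the new scheduler lines appear in the hunk above.

```python
import pytorch_lightning as pl
import torch


class MocoV2Sketch(pl.LightningModule):
    """Hypothetical minimal stand-in for pl_bolts' Moco_v2, showing only the patched hook."""

    def configure_optimizers(self):
        # SGD and the `learning_rate` hparam name are assumptions; the diff only
        # shows the momentum/weight_decay arguments and the new scheduler lines.
        optimizer = torch.optim.SGD(
            self.parameters(),
            self.hparams.learning_rate,
            momentum=self.hparams.momentum,
            weight_decay=self.hparams.weight_decay,
        )
        # Anneal the learning rate along a cosine curve over the whole run;
        # T_max is taken from the Trainer's max_epochs.
        scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
            optimizer,
            self.trainer.max_epochs,
        )
        # Lightning accepts a pair of lists (optimizers, schedulers) from this hook.
        return [optimizer], [scheduler]
```

By default, Lightning steps a scheduler returned this way once per training epoch, so the cosine cycle spans exactly `trainer.max_epochs`, which is the behavior this fix adds.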