hey @blefaudeux, thanks for raising this. Seems like we need to support SequentialLR and ChainedScheduler explicitly :)
OK, it seems this is fixed on PyTorch master for SequentialLR here: pytorch/pytorch#67406. For ChainedScheduler, I added an issue here: pytorch/pytorch#67601.
closing this. feel free to reopen if required :)
🐛 Bug
Using PyTorch's SequentialLR with Lightning leads to
AttributeError: 'SequentialLR' object has no attribute 'optimizer'
in _validate_scheduler_optimizer (pytorch_lightning/trainer/optimizers.py). This makes sense, since SequentialLR indeed does not have a self.optimizer attribute (though on the user side this should not be visible, and things should just work).

cc @SeanNaren @tchaton, this would be a nice way to repro MinGPT's schedule (warmup then exponential decay).
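For context, the MinGPT-style schedule mentioned above maps onto SequentialLR roughly like this in plain PyTorch, before wiring it into Lightning (the warmup length, gamma, and toy model are made-up values for illustration):

```python
import torch
from torch.optim.lr_scheduler import ExponentialLR, LinearLR, SequentialLR

# Toy model/optimizer just to drive the scheduler.
model = torch.nn.Linear(4, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# MinGPT-style schedule: linear warmup, then exponential decay.
warmup = LinearLR(optimizer, start_factor=0.1, total_iters=5)
decay = ExponentialLR(optimizer, gamma=0.9)
scheduler = SequentialLR(optimizer, schedulers=[warmup, decay], milestones=[5])

for _ in range(10):
    optimizer.step()
    scheduler.step()

# On PyTorch 1.10, `scheduler.optimizer` does not exist, which is exactly
# what Lightning's _validate_scheduler_optimizer trips over.
```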
To Reproduce
Instantiate torch.optim.lr_scheduler.SequentialLR as your scheduler in Lightning.

Expected behavior
Works flawlessly
Environment
PyTorch 1.10