torch.Tensor for optimizer parameters #127699
Labels
module: optimizer
Related to torch.optim
needs design
triaged
This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
🚀 The feature, motivation and pitch
The LR already accepts a torch.Tensor, see #120934 (comment), but the other hyperparameters do not, such as AdamW's beta2.
This causes torch.compile to fail (falling back to eager) when I compile an optimizer whose hyperparameters change during training, since torch.compile does not allow non-constant floats.
Allowing the other (non-LR) hyperparameters to be tensors would be helpful because it would avoid those recompilations and eager fallbacks; a minimal sketch of the failure mode follows.
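To illustrate, here is a minimal sketch of what I am hitting, assuming a toy beta2 schedule; the exact point at which torch.compile recompiles or falls back to eager depends on the version and cache limits.

```python
import torch

model = torch.nn.Linear(4, 4)
opt = torch.optim.AdamW(model.parameters(), lr=1e-3, betas=(0.9, 0.999))

@torch.compile
def step():
    opt.step()

for i in range(3):
    model(torch.randn(2, 4)).sum().backward()
    # Hypothetical beta2 schedule: torch.compile specializes on the float
    # value, so every new value can trigger a recompile and, past the cache
    # limit, a fallback to eager.
    opt.param_groups[0]["betas"] = (0.9, 0.999 - 0.0001 * i)
    step()
    opt.zero_grad()
```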
Alternatives
Either torch.compile working with (changing) floats or AdamW allowing torch.Tensor params would work for me; see the sketch below for the latter.
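A rough sketch of the latter option, mirroring the tensor lr support that already exists; the betas-as-tensor call is the requested feature and is commented out because it is not accepted today.

```python
import torch

model = torch.nn.Linear(4, 4)

# Works today: lr may be a 0-dim tensor (see #120934), so its value can be
# updated in place instead of being baked into the compiled graph.
lr = torch.tensor(1e-3)
opt = torch.optim.AdamW(model.parameters(), lr=lr)

# Requested (hypothetical, not supported at the time of writing): accept
# tensors for the remaining hyperparameters too, e.g. beta2.
# beta2 = torch.tensor(0.999)
# opt = torch.optim.AdamW(model.parameters(), lr=lr, betas=(0.9, beta2))

@torch.compile
def step():
    opt.step()

model(torch.randn(2, 4)).sum().backward()
step()
lr.fill_(5e-4)  # change the value in place rather than rebinding a float
step()
```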
Additional context
No response
cc @vincentqb @jbschlosser @albanD @janeyx99 @crcrpar