Closed
Labels
module: nn (Related to torch.nn), triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Description
🚀 The feature, motivation and pitch
Currently, the RMSNorm module supports the elementwise_affine kwarg but not the bias kwarg. Would it be straightforward to support a configuration where elementwise_affine is set to True but bias can be set to False, as LayerNorm already allows?
This configuration is fairly prevalent and is used by many modern models, including the Llama and Mamba families. A sketch of the requested behavior is given below.
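A minimal sketch of what the requested kwargs could look like, assuming an interface that mirrors LayerNorm's. The class name, defaults, and eps value here are illustrative only and are not part of torch.nn:

```python
import torch
import torch.nn as nn

# Illustrative sketch (not PyTorch's implementation): an RMSNorm-style module
# where elementwise_affine controls the learnable scale and a separate bias
# kwarg controls the additive term, mirroring nn.LayerNorm's signature.
class RMSNormWithBias(nn.Module):
    def __init__(self, dim: int, eps: float = 1e-6,
                 elementwise_affine: bool = True, bias: bool = False):
        super().__init__()
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(dim)) if elementwise_affine else None
        self.bias = nn.Parameter(torch.zeros(dim)) if (elementwise_affine and bias) else None

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalize by the root mean square over the last dimension.
        rms = x.pow(2).mean(dim=-1, keepdim=True).add(self.eps).rsqrt()
        out = x * rms
        if self.weight is not None:
            out = out * self.weight
        if self.bias is not None:
            out = out + self.bias
        return out

# The prevalent Llama/Mamba-style configuration: learnable scale, no bias.
norm = RMSNormWithBias(512, elementwise_affine=True, bias=False)
y = norm(torch.randn(2, 10, 512))
```

For comparison, nn.LayerNorm already exposes this combination via nn.LayerNorm(512, elementwise_affine=True, bias=False).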
Alternatives
No response
Additional context
No response
cc @albanD @mruberry @jbschlosser @walterddr @mikaylagawarecki