Status: Closed
Labels: bug (Something isn't working)
Description
annotated_deep_learning_paper_implementations/labml_nn/normalization/deep_norm/__init__.py
Line 105 in 25ad4d6
self.layer_norm = LayerNorm(normalized_shape, eps=eps, elementwise_affine=elementwise_affine)
self.layer_norm is never used in the forward function, and self.alpha should be multiplied by x, not by gx.
The correct forward function might be:
def forward(self, x: torch.Tensor, gx: torch.Tensor):
    return self.layer_norm(self.alpha * x + gx)
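To make the intended behaviour concrete, here is a minimal plain-Python sketch of the corrected DeepNorm residual, where the residual stream x is scaled by alpha before the sub-layer output gx is added and the sum is layer-normalized. The layer_norm and deep_norm helpers below are illustrative stand-ins, not the library's classes, and the sketch omits the elementwise affine parameters for brevity.

```python
import math

def layer_norm(v, eps=1e-5):
    # Normalize a list of floats to zero mean and unit variance
    # (illustrative helper; no learnable affine parameters).
    mean = sum(v) / len(v)
    var = sum((x - mean) ** 2 for x in v) / len(v)
    return [(x - mean) / math.sqrt(var + eps) for x in v]

def deep_norm(x, gx, alpha):
    # Corrected DeepNorm residual: LayerNorm(alpha * x + gx).
    # alpha scales the residual stream x, not the sub-layer output gx.
    return [round(y, 6) for y in
            layer_norm([alpha * xi + gi for xi, gi in zip(x, gx)])]

out = deep_norm([1.0, 2.0, 3.0], [0.0, 0.0, 0.0], alpha=2.0)
print(out)
```

With gx set to zeros, the result is just a layer-normalized, alpha-scaled x: zero mean and (up to eps) unit variance, which is easy to verify by hand.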