import torch
We define two kinds of normalization: component and norm.
component
normalization refers to tensors in which each component takes values around 1. More precisely, the second moment of each component is 1:

⟨xᵢ²⟩ = 1
Examples:
[1.0, -1.0, -1.0, 1.0]
[1.0, 1.0, 1.0, 1.0]
the mean does not need to be zero: [0.0, 2.0, 0.0, 0.0]
this is still fine because ∥x∥² = n
torch.randn(10)
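A quick sanity check of the component convention (the sample size here is arbitrary, just large enough to make the estimate stable):

```python
import torch

# A standard normal tensor is component-normalized:
# each component has zero mean and second moment 1.
x = torch.randn(100_000)

# Empirical estimate of <x_i^2>; it should be close to 1.
second_moment = x.pow(2).mean()
print(second_moment)
```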
norm
normalization refers to tensors whose norm is close to 1:
∥x∥ ≈ 1
Examples:
[0.5, -0.5, -0.5, 0.5]
[0.5, 0.5, 0.5, 0.5]
the mean does not need to be zero: [0.0, 1.0, 0.0, 0.0]
torch.randn(10) / 10**0.5
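The same kind of sanity check for the norm convention, using a larger dimension than in the example above so that the norm concentrates tightly (the size is an arbitrary choice):

```python
import torch

n = 10_000
# Dividing a component-normalized tensor by sqrt(n)
# gives a tensor whose norm is close to 1.
x = torch.randn(n) / n**0.5
print(x.norm())
```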
There is just a factor of √n between the two normalizations: if ⟨xᵢ²⟩ = 1 then ∥x∥ ≈ √n, so dividing a component-normalized tensor by √n gives a norm-normalized one.
Assuming that the weight distribution obeys

⟨wᵢ⟩ = 0
⟨wᵢwⱼ⟩ = σ²δᵢⱼ

the first two moments of x ⋅ w (and therefore its mean and variance) are functions of the second moment of x only.
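A minimal empirical sketch of this claim (the sizes and the value of σ are arbitrary): draw many independent weight vectors with the moments above and compare the mean and variance of x ⋅ w against σ² ∑ᵢ xᵢ².

```python
import torch

n, trials = 64, 100_000
sigma = 0.5

x = torch.randn(n)                  # fixed input, component-normalized
w = sigma * torch.randn(trials, n)  # weights: <w_i> = 0, <w_i w_j> = sigma^2 δ_ij

out = w @ x                         # one dot product x · w per weight draw

# The mean should be ~0 and the variance ~ sigma^2 * sum_i x_i^2,
# i.e. the output statistics depend on x only through its second moment.
print(out.mean())
print(out.var())
print(sigma**2 * x.pow(2).sum())
```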
You can use e3nn.util.test.assert_normalized
to check whether a function or module is normalized at initialization:
from e3nn.util.test import assert_normalized
from e3nn import o3
assert_normalized(o3.Linear("10x0e", "10x0e"))