### Describe the bug

When setting up an `EncoderNormalizer` with the `standard` method and no centering, the transformation is not what is expected. The encoder correctly sets its `.center_` to zero, but its `.scale_` is not set to the std of the fitted values. It is instead set to the *mean* of the fitted values, and hence can even be negative.
### To Reproduce

```python
from pytorch_forecasting.data.encoders import EncoderNormalizer
import torch

# generate a random sequence with mean -0.5 and unit std
x = torch.randn(60) - 0.5
print('x mean and std:', x.mean(), x.std())

# create a standard encoder with centering
encoder = EncoderNormalizer(
    method='standard',
    center=True,
    max_length=None,
    transformation=None,
    method_kwargs={},
)

# fit and transform
y = encoder.fit_transform(x)

# center_ and scale_ match the mean and std of x
print('center_ and scale_:', encoder.center_, encoder.scale_)

# the transformed values are what is expected
print('error:', (y - (x - x.mean()) / x.std()).abs().max())

# create a standard encoder with *NO* centering
encoder = EncoderNormalizer(
    method='standard',
    center=False,
    max_length=None,
    transformation=None,
    method_kwargs={},
)

# fit and transform
y = encoder.fit_transform(x)

# center_ is zero as expected, but scale_ *is not* the std of x, it is the *mean*
print('center_ and scale_:', encoder.center_, encoder.scale_)

# the transformed values are *not* what is expected
print('error:', (y - x / x.std()).abs().max())
```
### Expected behavior

In the `center=False` case, `encoder.scale_` should be equal to `x.std()` (= 0.9360 in the run above) and `y` should be equal to `x / x.std()`.
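To make the expectation concrete without random data, here is a minimal plain-Python sketch of standard scaling (`standard_scale` is a hypothetical helper for illustration, not pytorch_forecasting API): with `center=False` the center is treated as zero, but the scale should still be the standard deviation of the fitted values.

```python
import statistics

def standard_scale(values, center=True):
    # hypothetical helper illustrating the expected behaviour,
    # not the library's actual implementation
    mean = statistics.mean(values)
    std = statistics.stdev(values)  # sample std, matching torch.std's default
    center_ = mean if center else 0.0  # no centering -> center_ is zero
    scale_ = std  # scale_ should be the std either way, never the mean
    return [(v - center_) / scale_ for v in values], center_, scale_

# a small sample with negative mean (-0.5), like the repro above
x = [-1.5, -0.5, 0.5, -1.0, 0.0]
y, center_, scale_ = standard_scale(x, center=False)
# center_ is 0.0 and scale_ is positive (the std), even though mean(x) < 0
```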
### Additional context

In the code for the method `_set_parameters()` of the class `TorchNormalizer`, I don't understand these lines:

```python
if not self.center and self.method != "identity":
    self.scale_ = self.center_
```
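For what it's worth, the behaviour I would expect from that branch can be sketched with a toy stand-in (`TinyNormalizer` is illustrative only, not the library class; only the `center_`/`scale_` bookkeeping is mirrored):

```python
import statistics

class TinyNormalizer:
    # toy stand-in for TorchNormalizer, illustrating the suspected fix only
    def __init__(self, center=True):
        self.center = center

    def fit(self, values):
        self.center_ = statistics.mean(values)
        self.scale_ = statistics.stdev(values)
        if not self.center:
            # suspected fix: zero the center but keep scale_ as the std
            # (the reported behaviour effectively does scale_ = center_ instead)
            self.center_ = 0.0
        return self

    def transform(self, values):
        return [(v - self.center_) / self.scale_ for v in values]

x = [-1.5, -0.5, 0.5, -1.0, 0.0]  # mean -0.5
enc = TinyNormalizer(center=False).fit(x)
# enc.center_ is 0.0 and enc.scale_ is the (positive) std of x
```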
### Versions

1.4.0
Status: Reproduced/confirmed