This pull request contains partial work towards robustifying the Distribution outputs of NeuralForecast models.
Before this PR, Base classes performed `self._normalization` and `self._inv_normalization` before sending the `outsample_y` signal to the distribution losses. This processing had bad implications for the Poisson distribution, because in edge cases the `self._inv_normalization` methods induce negative values due to rounding errors, in addition to changing the signal type from integer to real.

The PR improves `Poisson` with the following:
- `BaseWindows.train_step` and `BaseWindows.validation_step` use `original_outsample_y` in the DistributionLoss rather than the processed `outsample_y`.
- `TFT.train_step` and `BaseWindows.validation_step` use `original_outsample_y` in the DistributionLoss rather than the processed `outsample_y`.
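The rounding issue above can be sketched in a few lines (a minimal illustration with made-up scale statistics, using `torch.distributions.Poisson` as a stand-in for the library's DistributionLoss): after a normalize/inverse-normalize round trip, integer counts come back as reals, and a tiny negative perturbation falls outside the Poisson support.

```python
import torch

# Integer count targets, as a Poisson likelihood expects.
y = torch.tensor([0.0, 2.0, 5.0])

# Normalize and invert with hypothetical scale statistics; the
# round trip returns reals that need not be exact integers.
mean, std = y.mean(), y.std()
y_back = (y - mean) / std * std + mean

# A tiny negative perturbation (as rounding can produce for zeros)
# is outside the Poisson support and is rejected.
poisson = torch.distributions.Poisson(
    torch.tensor([1.0, 2.0, 5.0]), validate_args=True
)
try:
    poisson.log_prob(y_back - 1e-7)
except ValueError as err:
    print("invalid Poisson target:", err)
```

This is why the changed steps feed the untouched `original_outsample_y` to the distribution loss instead of the round-tripped signal.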
The PR improves `GMM/PMM` with the following:
- `torch.logsumexp` into `GMM/PMM`.
- `GMM/PMM`.
- `GMM/PMM`.
- `PMM` regularization.

I document the improvements in the `HierarchicalNetworks.ipynb` notebook, which reaches SoTA CRPS with `GMM/PMM` and performs reasonably well with `Poisson`.
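The `torch.logsumexp` change can be motivated with a generic mixture log-likelihood (a sketch with made-up numbers, not NeuralForecast's actual `GMM/PMM` code): summing exponentiated component log-densities underflows when they are very negative, while `logsumexp` factors out the maximum first and stays finite.

```python
import torch

# Log-densities of one observation under three mixture components,
# already shifted by the log mixture weights (hypothetical values).
weighted_log_probs = torch.tensor([-1001.6, -1003.2, -999.7])

# Naive mixture log-likelihood: exp underflows to 0, log gives -inf.
naive = torch.log(torch.sum(torch.exp(weighted_log_probs)))

# Stable version: logsumexp subtracts the max before exponentiating.
stable = torch.logsumexp(weighted_log_probs, dim=0)

print(naive)   # tensor(-inf)
print(stable)  # finite, close to the largest component's log-density
```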
There is missing work on the `original_outsample_y` signal for the `BaseRecurrent` and `BaseMultivariate` classes.