v1.4.0

Released by @AzulGarza on 14 Feb 22:25

New Models

  • Temporal Convolutional Network (TCN)
  • AutoNBEATSx
  • AutoTFT (Temporal Fusion Transformer)
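
As a minimal sketch of instantiating the new models (the module paths and the num_samples tuning budget follow the package's usual conventions and are assumptions here; all hyperparameter values are illustrative):

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import TCN
from neuralforecast.auto import AutoNBEATSx, AutoTFT

# TCN is configured like the other models: forecast horizon `h`
# plus the autoregressive window `input_size`.
tcn = TCN(h=12, input_size=24)

# The Auto* variants wrap a hyperparameter search around the base model;
# `num_samples` bounds how many configurations are explored.
auto_nbeatsx = AutoNBEATSx(h=12, num_samples=10)
auto_tft = AutoTFT(h=12, num_samples=10)

nf = NeuralForecast(models=[tcn, auto_nbeatsx, auto_tft], freq='M')
```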

New features

  • Recurrent models (RNN, LSTM, GRU, DilatedRNN) can now take static, historical, and future exogenous variables. These variables are combined with lags to produce "context" vectors via MLP decoders, following the MQ-RNN model (https://arxiv.org/pdf/1711.11053.pdf).
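
A hedged sketch of this feature (the column names market_id, sales_lag, and promo are hypothetical; the stat_exog_list/hist_exog_list/futr_exog_list parameter names follow the library's exogenous-variable convention):

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import LSTM

# Route each hypothetical column to the matching encoder:
# 'market_id' is static per series, 'sales_lag' is only observed
# historically, and 'promo' is known into the future.
model = LSTM(
    h=12,
    input_size=24,
    stat_exog_list=['market_id'],
    hist_exog_list=['sales_lag'],
    futr_exog_list=['promo'],
)

nf = NeuralForecast(models=[model], freq='D')
# nf.fit(df=train_df, static_df=static_df)  # train_df carries y plus the exogenous columns
```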

  • The new DistributionLoss class enables probabilistic forecasts with all available models. Setting the loss hyperparameter to one of these losses makes the model learn and output the distribution's parameters (see the sketch after this list):

    • Bernoulli, Poisson, Normal, StudentT, Negative Binomial, and Tweedie distributions
    • Scale-decoupled optimization using Temporal Scalers to improve convergence and performance.
    • The predict method can return samples, quantiles, or distribution parameters.
  • sCRPS loss implemented in PyTorch to directly minimize prediction-interval errors.
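
A minimal sketch of the DistributionLoss workflow (the model choice, the Poisson distribution, and the interval levels are illustrative):

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS
from neuralforecast.losses.pytorch import DistributionLoss

# Swap the default point loss for a distributional one: the network
# learns the Poisson rate, `level` requests the 80% and 90% prediction
# intervals, and `return_params=True` also exposes the learned
# distribution parameters at predict time.
model = NHITS(
    h=12,
    input_size=24,
    loss=DistributionLoss(distribution='Poisson', level=[80, 90], return_params=True),
)

nf = NeuralForecast(models=[model], freq='M')
# nf.fit(df=train_df)
# forecasts = nf.predict()  # mean, quantile, and parameter columns
```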

Optimization improvements

We included new optimization features commonly used to train neural models:

  • Added a learning rate scheduler based on torch.optim.lr_scheduler.StepLR. The new num_lr_decays hyperparameter controls the number of decays (evenly spaced across training).
  • Added early stopping based on the validation loss. The new early_stop_patience_steps hyperparameter sets the number of validation checks without improvement after which training stops.
  • New valid_loss hyperparameter to allow different training and validation losses.

Training duration, the scheduler, validation loss computation, and early stopping are now defined in steps (instead of epochs) for finer control of the training procedure. Use max_steps to set the number of training iterations. Note: max_epochs will be deprecated in a future release.
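
Putting the new controls together in one hedged sketch (all values are illustrative; val_check_steps, an assumed companion hyperparameter controlling how often validation runs, is not named above):

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import NBEATSx
from neuralforecast.losses.pytorch import MSE, MAE

model = NBEATSx(
    h=12,
    input_size=24,
    max_steps=1000,               # training length defined in steps, not epochs
    learning_rate=1e-3,
    num_lr_decays=3,              # three evenly spaced StepLR decays
    val_check_steps=100,          # compute validation loss every 100 steps
    early_stop_patience_steps=5,  # stop after 5 checks without improvement
    loss=MSE(),                   # training loss
    valid_loss=MAE(),             # a different validation loss
)

nf = NeuralForecast(models=[model], freq='M')
# nf.fit(df=train_df, val_size=24)  # early stopping requires a validation set
```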

New tutorials and documentation

  • Probabilistic Long-horizon forecasting
  • Save and Load Models to use them on different datasets
  • Temporal Fusion Transformer
  • Exogenous variables
  • Automatic hyperparameter tuning
  • Intermittent or Sparse Time Series
  • Detect Demand Peaks