ETSformer: Exponential Smoothing Transformers for Time-series Forecasting



Figure 1. Overall ETSformer Architecture.

Official PyTorch code repository for the ETSformer paper. Check out our blog post!

  • ETSformer is a novel time-series Transformer architecture that exploits the principle of exponential smoothing to improve Transformers for time-series forecasting.
  • Inspired by classical exponential smoothing methods, ETSformer replaces the self-attention mechanism of the vanilla Transformer with two novel mechanisms, exponential smoothing attention (ESA) and frequency attention (FA), improving both accuracy and efficiency (see the sketch below).
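
To make the exponential smoothing attention idea concrete, below is a minimal PyTorch sketch of the underlying principle: each position attends to past positions with weights that decay exponentially in the lag. This is an illustration of the principle only, not the repository's ESA module; the function name and the fixed smoothing parameter alpha are ours, whereas the actual model learns its smoothing behavior end to end.

    import torch

    def exponential_smoothing_weights(seq_len: int, alpha: float) -> torch.Tensor:
        # Weight of past position j at time t is alpha * (1 - alpha)**(t - j),
        # i.e. attention decays exponentially with the lag t - j.
        t = torch.arange(seq_len)
        lag = (t.unsqueeze(1) - t.unsqueeze(0)).float()  # lag[t, j] = t - j
        weights = alpha * (1 - alpha) ** lag
        return weights.tril()  # mask out future positions (j > t)

    # Smooth a toy multivariate series of shape (seq_len, n_features).
    x = torch.randn(16, 8)
    attn = exponential_smoothing_weights(16, alpha=0.5)
    smoothed = attn @ x  # exponentially weighted average of current and past values

Unlike vanilla self-attention, these weights are input-independent and favor recent observations, reflecting the forecasting prior that recent history matters most.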

Requirements

  1. Install Python 3.8.
  2. Install the required dependencies: pip install -r requirements.txt (an example setup is shown below).
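
For example, assuming a fresh conda environment (the environment name etsformer is just a placeholder):

    conda create -n etsformer python=3.8
    conda activate etsformer
    pip install -r requirements.txt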

Data

  • Pre-processed datasets can be downloaded from Tsinghua Cloud or Google Drive, as provided by Autoformer's GitHub repository.
  • Place the downloaded datasets into the dataset/ folder, e.g. dataset/ETT-small/ETTm2.csv (see the layout below).
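
After downloading and extracting, the folder layout should look as follows (only the ETTm2 file mentioned above is shown; the other benchmark datasets go alongside it in the same way):

    dataset/
    └── ETT-small/
        └── ETTm2.csv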

Usage

  1. Install the required dependencies.
  2. Download the data as described above and place it in the dataset/ folder.
  3. Train the model. We provide experiment scripts for all benchmarks under the ./scripts folder, e.g. ./scripts/ETTm2.sh. You may have to make the scripts executable by running chmod u+x scripts/*.
  4. A grid search script is also provided and can be run via ./grid_search.sh. An example invocation follows this list.
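
For example, to reproduce the ETTm2 benchmark:

    chmod u+x scripts/*   # one-time, if the scripts are not yet executable
    ./scripts/ETTm2.sh    # train and evaluate on ETTm2
    ./grid_search.sh      # optional: hyperparameter grid search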

Acknowledgements

The implementation of ETSformer relies on resources from the following codebases and repositories; we thank the original authors for open-sourcing their work.

Citation

Please consider citing our work if you find this code useful for your research.

@article{woo2022etsformer,
    title={ETSformer: Exponential Smoothing Transformers for Time-series Forecasting},
    author={Gerald Woo and Chenghao Liu and Doyen Sahoo and Akshat Kumar and Steven C. H. Hoi},
    journal={arXiv preprint arXiv:2202.01381},
    year={2022},
    url={https://arxiv.org/abs/2202.01381},
}