The primary objective of this research is to explore and reproduce the results demonstrated in the article "Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting". The study critically evaluates three algorithmic methods—ARIMA, Temporal Fusion Transformer (TFT), and NeuralProphet—on the same dataset. The experiments successfully replicate previous findings and demonstrate comparable performance, contributing to the understanding of time series forecasting with an emphasis on interpretability.
Alican Gündogdu, Benedikt Rein and Yuliya Vandysheva
Humboldt University of Berlin
Chair of Information Systems
Course: Information Systems
Professor: Stephan Lessmann
- Evaluation of Advanced Forecasting Methods: In-depth analysis and comparison of ARIMA, TFT, and NeuralProphet algorithms.
- Focus on Interpretability: Emphasizes the importance of making complex time series forecasting models understandable and interpretable.
- Reproduction of Existing Research: Validates and expands upon the findings of a key article in the field, demonstrating the replicability of complex models.
This repository provides code for replicating the experiments described in the paper, which is included as: Interpretable_timeseries_forecasting_Information_Science_Seminar.pdf
The code was developed on Linux but should run on other machines with minor changes.
Read the Notebook_interpretable_timeseries_forecasting.ipynb alongside the paper. The notebook contains exploratory analysis of the electricity dataset and presents our best-performing NeuralProphet and Temporal Fusion Transformer models alongside our baseline.
Run bash setup.sh to:
- download the needed dataset
- set up a conda environment
We cannot guarantee that the virtual environment setup works on every machine. If you prefer venv, or the setup fails, use requirements.txt to set up the environment.
With the virtual environment activated and requirements.txt installed, run:
python3 tft_electricity_hypertuning.py
to select optimal hyperparameters
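The idea behind the tuning script can be sketched as a simple random search over a hyperparameter space. Note that the search space, the objective, and the search strategy below are illustrative assumptions, not the actual configuration used in tft_electricity_hypertuning.py:

```python
import random

# Hypothetical search space; the real script may tune different
# parameters or use a dedicated tuning library.
SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "hidden_size": [16, 32, 64],
    "dropout": [0.1, 0.2, 0.3],
}

def validation_loss(params):
    # Placeholder objective: in the real script this would train a TFT
    # with the given parameters and return its validation loss.
    return (params["learning_rate"] - 1e-3) ** 2 + params["dropout"]

def random_search(n_trials=20, seed=0):
    # Sample configurations at random and keep the best one seen.
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(n_trials):
        params = {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}
        loss = validation_loss(params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

if __name__ == "__main__":
    params, loss = random_search()
    print(params, loss)
```

The selected configuration would then be hard-coded into the two TFT run scripts below.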
python3 tft_electricity_google_normalizer.py
to run our TFT implementation with already-tuned hyperparameters and the normalization scheme copied from the TFT paper
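The "copied normalization" can be sketched as per-series z-score scaling, where the statistics are fitted on the training split of each local series and reused for its test data. This is a minimal stdlib-only illustration of the idea, not the exact code in the run scripts:

```python
from statistics import mean, stdev

def fit_scaler(train_values):
    # Per-series z-score: mean and std are computed on the training
    # split only, then applied to later data of the same series.
    mu = mean(train_values)
    sigma = stdev(train_values)
    sigma = sigma if sigma > 0 else 1.0  # guard against constant series
    normalize = lambda xs: [(x - mu) / sigma for x in xs]
    denormalize = lambda zs: [z * sigma + mu for z in zs]
    return normalize, denormalize

train = [10.0, 12.0, 14.0, 16.0]
test = [18.0, 20.0]
norm, denorm = fit_scaler(train)
z = norm(test)           # model sees scaled values
back = denorm(z)         # predictions are mapped back to the original scale
```

The "build_in_normalizer" variants below skip this step and let the forecasting library scale the data itself.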
python3 tft_electricity_build_in_normalizer.py
to run our TFT implementation with already-tuned hyperparameters, letting the TFT module take care of normalization
python3 neuralprophet_electricity_build_in_normalizer.py
to run our NeuralProphet implementation without hyperparameter tuning, letting the NeuralProphet module take care of normalization
python3 neuralprophet_electricity_google_normalizer.py
to run our NeuralProphet implementation without hyperparameter tuning and the normalization scheme copied from the TFT paper
python3 arima_electricity.py
to fit an ARIMA model for every local time series with the normalization scheme copied from the TFT paper and save the predictions to a CSV file; the models themselves are not saved.
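The one-model-per-series workflow can be sketched with a toy AR(1) fit (a least-squares estimate on lag-1 values). The actual script presumably uses a full ARIMA library; this stdlib-only sketch only illustrates fitting each local series independently and persisting predictions, not models:

```python
import csv

def fit_ar1(series):
    # Least-squares estimate of phi in the model y_t ≈ phi * y_{t-1}.
    num = sum(y1 * y0 for y0, y1 in zip(series, series[1:]))
    den = sum(y0 * y0 for y0 in series[:-1])
    return num / den if den else 0.0

def forecast(series, phi, horizon):
    # Roll the fitted recurrence forward from the last observed value.
    preds, last = [], series[-1]
    for _ in range(horizon):
        last = phi * last
        preds.append(last)
    return preds

def run(series_by_id, horizon, path):
    # One local model per series; only the predictions are written out.
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["series_id", "step", "prediction"])
        for sid, series in series_by_id.items():
            phi = fit_ar1(series)
            for step, pred in enumerate(forecast(series, phi, horizon), 1):
                writer.writerow([sid, step, pred])
```

For the doubling series [1, 2, 4, 8, 16], fit_ar1 recovers phi = 2.0 and the two-step forecast is [32.0, 64.0].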
Download the electricity dataset.
We encourage contributions to this repository, especially in the areas of improving model interpretability and efficiency.
Special thanks to Humboldt University of Berlin and the Chair of Information Systems for the support and resources provided for this study. We also extend our gratitude to all the researchers and practitioners whose insights have shaped this field.