valeman/OutsmartingTime

Outsmarting Time:

Foundation Models for Zero-Shot Forecasting

This repository contains the code, data, and resources used in our thesis, which explores the application of transformer-based models to time series forecasting. Specifically, we benchmark the performance of three models, Chronos, TimeGPT, and Moirai, on the M3-Competition dataset.

Abstract

This thesis evaluates the performance of three foundation models, Chronos, TimeGPT, and Moirai, in the context of the M3-Competition. The models are based on the Transformer architecture, which underpins much of modern Artificial Intelligence, from Large Language Models to Computer Vision. We evaluate the foundation models in a zero-shot setting, meaning the models are not fitted to the data before inference, and compare them against the original entrants in the competition.
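The M3-Competition ranked its entrants primarily by symmetric MAPE (sMAPE), so a zero-shot forecast can be scored against the original competitors without any fitting step. A minimal sketch of such scoring, using NumPy and toy numbers (the function name and data below are illustrative, not taken from the thesis code):

```python
import numpy as np

def smape(actual, forecast):
    """Symmetric mean absolute percentage error, in percent.

    The variant used to rank M3 entrants: the absolute error is
    scaled by the sum of absolute actual and forecast values.
    """
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(
        2.0 * np.abs(forecast - actual) / (np.abs(actual) + np.abs(forecast))
    )

# Toy example: score a short forecast against held-out values.
y_true = np.array([112.0, 118.0, 132.0, 129.0])
y_pred = np.array([110.0, 120.0, 130.0, 131.0])
print(round(smape(y_true, y_pred), 2))
```

A model's sMAPE is averaged over all series (or over a subset such as the monthly series) to produce the kind of ranking reported below.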

Our evaluation shows promise for foundation models in time series forecasting. Overall, Chronos outperforms the other two foundation models and ranks fifth among the 23 models evaluated on the full dataset. The foundation models perform better on monthly data than on the other sampling frequencies in the dataset, with Chronos being the top performer of all evaluated models on this subset. Our findings suggest that foundation models are a viable approach to forecasting, and we recommend Chronos as the preferred model for forecasting monthly data.
