This repo tracks progress on Time Series Foundation Models.
- A Survey of Deep Learning and Foundation Models for Time Series Forecasting. John A. Miller (University of Georgia), Mohammed Aldosari, Farah Saeed, Nasid Habib Barna, Subas Rana, I. Budak Arpinar, and Ninghao Liu. link π18
- Foundation Models for Time Series Analysis: A Tutorial and Survey. Yuxuan Liang (The Hong Kong University of Science and Technology (Guangzhou)), Haomin Wen (Beijing Jiaotong University). link π26
- FoundTS: Comprehensive and Unified Benchmarking of Foundation Models for Time Series Forecasting. Zhe Li, Xiangfei Qiu, Peng Chen, Yihang Wang, Hanyin Cheng, Yang Shu, Jilin Hu, Chenjuan Guo, Aoying Zhou, Qingsong Wen, Christian S. Jensen, Bin Yang. link code
- GIFT-Eval: A Benchmark For General Time Series Forecasting Model Evaluation. Taha Aksu, Gerald Woo, Juncheng Liu, Xu Liu, Chenghao Liu, Silvio Savarese, Caiming Xiong, Doyen Sahoo. link
- UniMTS: Unified Pre-training for Motion Time Series. Xiyuan Zhang, Diyan Teng, Ranak Roy Chowdhury, Shuheng Li, Dezhi Hong, Rajesh K. Gupta, Jingbo Shang. link π14 code
- UNITS: A Unified Multi-Task Time Series Model. Shanghua Gao, Teddy Koker, Owen Queen, Thomas Hartvigsen, Theodoros Tsiligkaridis, Marinka Zitnik. link π14 code
- TIME-FFM: Towards LM-Empowered Federated Foundation Model for Time Series Forecasting. Qingxiang Liu, Xu Liu, Chenghao Liu, Qingsong Wen, Yuxuan Liang. link π1
- S^2IP-LLM: Semantic Space Informed Prompt Learning with LLM for Time Series Forecasting. Zijie Pan, Yushan Jiang, Sahil Garg, Anderson Schneider, Yuriy Nevmyvaka, Dongjin Song. link π10 code
- ROSE: Register Assisted General Time Series Forecasting with Decomposed Frequency Learning. Yihang Wang, Yuying Qiu, Peng Chen, Kai Zhao, Yang Shu, Zhongwen Rao, Lujia Pan, Bin Yang, Chenjuan Guo. link π0
- In-Context Fine-Tuning for Time-Series Foundation Models. Abhimanyu Das (Google Research), Matthew Faw, Rajat Sen, Yichen Zhou. link π0
- Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts. Xu Liu (Salesforce AI Research, National University of Singapore), Juncheng Liu, Gerald Woo, Taha Aksu, Yuxuan Liang, Roger Zimmermann, Chenghao Liu, Silvio Savarese, Caiming Xiong, Doyen Sahoo. link π0 code
- FoMo: A Foundation Model for Mobile Traffic Forecasting with Diffusion Model. Haoye Chai (Tsinghua University), Shiyuan Zhang, Xiaoqian Qi, Yong Li. link
- Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts. Xiaoming Shi (Princeton University), Shiyu Wang, Yuqi Nie, Dianqi Li, Zhou Ye, Qingsong Wen, and Ming Jin. link π0 code
- Tiny Time Mixers (TTMs): Fast Pre-trained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series. Vijay Ekambaram (IBM Granite), Arindam Jati, Nam H. Nguyen, Pankaj Dayama, Chandra Reddy, Wesley M. Gifford, and Jayant Kalagnanam. link π7 code
- Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting. Kashif Rasul, Arjun Ashok, Andrew Robert Williams, Hena Ghonia, Rishika Bhagwatkar, Arian Khorasani, Mohammad Javad Darvishi Bayazi et al. link π15 code
- Unified Training of Universal Time Series Forecasting Transformers. Gerald Woo (Salesforce AI Research), Chenghao Liu, Akshat Kumar, Caiming Xiong, Silvio Savarese, and Doyen Sahoo. link π31 code
- Chronos: Learning the Language of Time Series. Abdul Fatir Ansari (Amazon), Lorenzo Stella, Caner Turkmen, Xiyuan Zhang, Pedro Mercado, Huibin Shen, Oleksandr Shchur et al. link π46 code
- Moment: A family of open time-series foundation models. Mononito Goswami, Konrad Szafer, Arjun Choudhry, Yifu Cai, Shuo Li, Artur Dubrawski. link π22 code
- Timer: Generative Pre-trained Transformers Are Large Time Series Models. Yong Liu (Tsinghua University), Haoran Zhang, Chenyu Li, Xiangdong Huang, Jianmin Wang, Mingsheng Long. link π4 code
- Large Language Models Are Zero-Shot Time Series Forecasters. Nate Gruver (NYU), Marc Finzi (CMU), Shikai Qiu (NYU), and Andrew G. Wilson (NYU). link π174 code
- A decoder-only foundation model for time-series forecasting. Abhimanyu Das (Google Research), Weihao Kong, Rajat Sen, and Yichen Zhou. link π55 code
- One Fits All: Power General Time Series Analysis by Pretrained LM. Tian Zhou, Peisong Niu, Xue Wang, Liang Sun, Rong Jin. link π220 code