
Time Series Foundation Model

This repo tracks progress on Time Series Foundation Models.

Survey & Benchmark

2024

  • A Survey of Deep Learning and Foundation Models for Time Series Forecasting. John A. Miller (University of Georgia), Mohammed Aldosari, Farah Saeed, Nasid Habib Barna, Subas Rana, I. Budak Arpinar, and Ninghao Liu. link 🔗18
  • Foundation Models for Time Series Analysis: A Tutorial and Survey. Yuxuan Liang (The Hong Kong University of Science and Technology (Guangzhou)), Haomin Wen (Beijing Jiaotong University). link 🔗26
  • FoundTS: Comprehensive and Unified Benchmarking of Foundation Models for Time Series Forecasting. Zhe Li, Xiangfei Qiu, Peng Chen, Yihang Wang, Hanyin Cheng, Yang Shu, Jilin Hu, Chenjuan Guo, Aoying Zhou, Qingsong Wen, Christian S. Jensen, Bin Yang. link code
  • GIFT-Eval: A Benchmark For General Time Series Forecasting Model Evaluation. Taha Aksu, Gerald Woo, Juncheng Liu, Xu Liu, Chenghao Liu, Silvio Savarese, Caiming Xiong, Doyen Sahoo. link

Work

2024

  • UniMTS: Unified Pre-training for Motion Time Series. Xiyuan Zhang, Diyan Teng, Ranak Roy Chowdhury, Shuheng Li, Dezhi Hong, Rajesh K. Gupta, Jingbo Shang. link 🔗14 code
  • UNITS: A Unified Multi-Task Time Series Model. Shanghua Gao, Teddy Koker, Owen Queen, Thomas Hartvigsen, Theodoros Tsiligkaridis, Marinka Zitnik. link 🔗14 code
  • TIME-FFM: Towards LM-Empowered Federated Foundation Model for Time Series Forecasting. Qingxiang Liu, Xu Liu, Chenghao Liu, Qingsong Wen, Yuxuan Liang. link 🔗1
  • S^2IP-LLM: Semantic Space Informed Prompt Learning with LLM for Time Series Forecasting. Zijie Pan, Yushan Jiang, Sahil Garg, Anderson Schneider, Yuriy Nevmyvaka, Dongjin Song. link 🔗10 code
  • ROSE: Register Assisted General Time Series Forecasting with Decomposed Frequency Learning. Yihang Wang, Yuying Qiu, Peng Chen, Kai Zhao, Yang Shu, Zhongwen Rao, Lujia Pan, Bin Yang, Chenjuan Guo. link 🔗0
  • In-Context Fine-Tuning for Time-Series Foundation Models. Abhimanyu Das (Google Research), Matthew Faw, Rajat Sen, Yichen Zhou. link 🔗0
  • Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts. Xu Liu (Salesforce AI Research, National University of Singapore), Juncheng Liu, Gerald Woo, Taha Aksu, Yuxuan Liang, Roger Zimmermann, Chenghao Liu, Silvio Savarese, Caiming Xiong, Doyen Sahoo. link 🔗0 code
  • FoMo: A Foundation Model for Mobile Traffic Forecasting with Diffusion Model. Haoye Chai (Tsinghua University), Shiyuan Zhang, Xiaoqian Qi, Yong Li. link
  • Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts. Xiaoming Shi (Princeton University), Shiyu Wang, Yuqi Nie, Dianqi Li, Zhou Ye, Qingsong Wen, and Ming Jin. link 🔗0 code
  • Tiny Time Mixers (TTMs): Fast Pre-trained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series. Vijay Ekambaram (IBM Granite), Arindam Jati, Nam H. Nguyen, Pankaj Dayama, Chandra Reddy, Wesley M. Gifford, and Jayant Kalagnanam. link 🔗7 code
  • Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting. Kashif Rasul, Arjun Ashok, Andrew Robert Williams, Hena Ghonia, Rishika Bhagwatkar, Arian Khorasani, Mohammad Javad Darvishi Bayazi et al. link 🔗15 code
  • Unified Training of Universal Time Series Forecasting Transformers. Gerald Woo (Salesforce AI Research), Chenghao Liu, Akshat Kumar, Caiming Xiong, Silvio Savarese, and Doyen Sahoo. link 🔗31 code
  • Chronos: Learning the Language of Time Series. Abdul Fatir Ansari (Amazon), Lorenzo Stella, Caner Turkmen, Xiyuan Zhang et al. link 🔗46 code
  • MOMENT: A Family of Open Time-series Foundation Models. Mononito Goswami, Konrad Szafer, Arjun Choudhry, Yifu Cai, Shuo Li, Artur Dubrawski. link 🔗22 code
  • Timer: Generative Pre-trained Transformers Are Large Time Series Models. Yong Liu (Tsinghua University), Haoran Zhang, Chenyu Li, Xiangdong Huang, Jianmin Wang, Mingsheng Long. link 🔗4 code

2023

  • Large Language Models Are Zero-Shot Time Series Forecasters. Nate Gruver (NYU), Marc Finzi (CMU), Shikai Qiu (NYU), and Andrew G. Wilson (NYU). link 🔗174 code
  • A decoder-only foundation model for time-series forecasting. Abhimanyu Das (Google Research), Weihao Kong, Rajat Sen, and Yichen Zhou. link 🔗55 code
  • One Fits All: Power General Time Series Analysis by Pretrained LM. Tian Zhou, Peisong Niu, Xue Wang, Liang Sun, Rong Jin. link 🔗220 code

Dataset
