# Awesome Deep Time-Series Representations

This is a repository for all readers interested in learning universal representations of time series with deep learning. If your paper is missing or you have other requests, please open an issue, create a pull request, or contact patara.t@kaist.ac.kr. We will update this repository on a regular basis, following the top-tier conference publication cycles, to keep it up to date.

**Next Batch:** ICDM 2024, CIKM 2024, NeurIPS 2024

**Accompanying Paper:** [Universal Time-Series Representation Learning: A Survey](https://arxiv.org/abs/2401.03717)

@article{trirat2024universal,
  title={Universal Time-Series Representation Learning: A Survey},
  author={Patara Trirat and Yooju Shin and Junhyeok Kang and Youngeun Nam and Jihye Na and Minyoung Bae and Joeun Kim and Byunghyun Kim and Jae-Gil Lee},
  journal={arXiv preprint arXiv:2401.03717},
  year={2024}
}

## Proposed Taxonomy

*Figure: proposed taxonomy*

## Contents

- Related Surveys: Time-Series Data Mining and Analysis, Representation Learning
- Research Papers: Neural Architectural Approaches, Learning-Focused Approaches

## Related Surveys (Latest Update: July 2024)

### Time-Series Data Mining and Analysis

| Title | Affiliation | Venue | Year |
|---|---|---|---|
| Discrete Wavelet Transform-based Time Series Analysis and Mining | University of Maryland | ACM CSUR | 2011 |
| Time-Series Data Mining | IRCAM | ACM CSUR | 2012 |
| A Review of Unsupervised Feature Learning and Deep Learning for Time-Series Modeling | Örebro University | Pattern Recognition Letters | 2014 |
| Time-series clustering – A decade review | University of Malaya | Information Systems | 2015 |
| Deep Learning for Time-Series Analysis | University of Kaiserslautern | arXiv | 2017 |
| A survey of methods for time series change point detection | Washington State University | KAIS | 2017 |
| Survey on time series motif discovery | Ostwestfalen-Lippe University of Applied Sciences | WIDM | 2017 |
| Wavelet Transform Application for/in Non-Stationary Time-Series Analysis: A Review | Ecole Nationale des Sciences de l'Informatique | MDPI Applied Sciences | 2019 |
| Deep learning for time series classification: a review | Université Haute Alsace | Data Mining and Knowledge Discovery | 2019 |
| Anomaly Detection for IoT Time-Series Data: A Survey | University of Keele | IEEE IoT-J | 2019 |
| A Review of Deep Learning Methods for Irregularly Sampled Medical Time Series Data | Peking University | arXiv | 2020 |
| Approaches and Applications of Early Classification of Time Series: A Review | Indian Institute of Technology (BHU) Varanasi | IEEE TAI | 2020 |
| A Survey on Principles, Models and Methods for Learning from Irregularly Sampled Time Series | University of Massachusetts Amherst | NeurIPS Workshop on ML-RSA | 2020 |
| A Review of Deep Learning Models for Time Series Prediction | Dalian University of Technology | IEEE Sensors Journal | 2021 |
| An empirical survey of data augmentation for time series classification with neural networks | Kyushu University | PLOS ONE | 2021 |
| Time-series forecasting with deep learning: a survey | University of Oxford | Phil. Trans. R. Soc. A | 2021 |
| A Review on Outlier/Anomaly Detection in Time Series Data | Basque Research and Technology Alliance | ACM CSUR | 2021 |
| A Review of Time-Series Anomaly Detection Techniques: A Step to Future Perspectives | University of Newcastle | FICC | 2021 |
| Deep Learning for Anomaly Detection in Time-Series Data: Review, Analysis, and Guidelines | Seoul National University | IEEE Access | 2021 |
| Time Series Data Augmentation for Deep Learning: A Survey | Alibaba Group | IJCAI | 2021 |
| An Experimental Review on Deep Learning Architectures for Time Series Forecasting | University of Sevilla | IJNS | 2021 |
| Experimental Comparison and Survey of Twelve Time Series Anomaly Detection Algorithms | Verint | JAIR | 2021 |
| Causal inference for time series analysis: problems, methods and evaluation | Arizona State University | KAIS | 2021 |
| End-to-end deep representation learning for time series clustering: a comparative study | Université de Haute Alsace | Data Mining and Knowledge Discovery | 2022 |
| Survey and Evaluation of Causal Discovery Methods for Time Series | Université Grenoble Alpes | JAIR | 2022 |
| A Review of Recurrent Neural Network-Based Methods in Computational Physiology | University of Pittsburgh | IEEE TNNLS | 2022 |
| Deep Learning for Time Series Anomaly Detection: A Survey | Monash University | arXiv | 2022 |
| Deep Learning for Time Series Forecasting: Tutorial and Literature Survey | Amazon Research | ACM CSUR | 2022 |
| Transformers in Time Series: A Survey | Alibaba Group | IJCAI | 2023 |
| Deep Learning for Time Series Classification and Extrinsic Regression: A Current Survey | Monash University | arXiv | 2023 |
| Label-efficient Time Series Representation Learning: A Review | Nanyang Technological University | arXiv | 2023 |
| Neural Time Series Analysis with Fourier Transform: A Survey | Beijing Institute of Technology | arXiv | 2023 |
| A Survey on Dimensionality Reduction Techniques for Time-Series Data | University of Colorado Boulder | IEEE Access | 2023 |
| Long sequence time-series forecasting with deep learning: A survey | Southwest Jiaotong University | Information Fusion | 2023 |
| Data Augmentation techniques in time series domain: a survey and taxonomy | Universidad Politécnica de Madrid | Neural Computing & Applications | 2023 |
| Diffusion Models for Time Series Applications: A Survey | University of Sydney | arXiv | 2023 |
| A Survey on Time-Series Pre-Trained Models | South China University of Technology | arXiv | 2023 |
| Self-Supervised Contrastive Learning for Medical Time Series: A Systematic Review | RMIT University | MDPI Sensors | 2023 |
| Self-Supervised Learning for Time Series Analysis: Taxonomy, Progress, and Prospects | Zhejiang University | IEEE TPAMI | 2024 |
| Unsupervised Representation Learning for Time Series: A Review | Shandong University | arXiv | 2023 |
| Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook | Monash University | arXiv | 2023 |
| Foundation Models for Time Series Analysis: A Tutorial and Survey | The Hong Kong University of Science and Technology | arXiv | 2024 |
| Large Language Models for Time Series: A Survey | University of California, San Diego | arXiv | 2024 |
| Empowering Time Series Analysis with Large Language Models: A Survey | University of Connecticut | arXiv | 2024 |
| A Survey of Time Series Foundation Models: Generalizing Time Series Representation with Large Language Model | Hong Kong University of Science and Technology | arXiv | 2024 |
| Position: What Can Large Language Models Tell Us about Time Series Analysis | Griffith University, Chinese Academy of Sciences, The Hong Kong University of Science and Technology (Guangzhou) | ICML | 2024 |
| Deep Time Series Models: A Comprehensive Survey and Benchmark | Tsinghua University | arXiv | 2024 |

### Representation Learning

| Title | Affiliation | Venue | Year |
|---|---|---|---|
| Representation Learning: A Review and New Perspectives | University of Montreal | IEEE TPAMI | 2013 |
| A Survey of Multi-View Representation Learning | Zhejiang University | IEEE TKDE | 2019 |
| Deep Multimodal Representation Learning: A Survey | Fuzhou University | IEEE Access | 2019 |
| A Survey on Representation Learning for User Modeling | University of Georgia | IJCAI | 2020 |
| A survey on deep geometry learning: From a representation perspective | Chinese Academy of Sciences | Computational Visual Media | 2020 |
| A Review on Deep Learning Approaches for 3D Data Representations in Retrieval and Classifications | Xiamen University | IEEE Access | 2020 |
| Contrastive Representation Learning: A Framework and Review | Dublin City University | IEEE Access | 2020 |
| Beyond Just Vision: A Review on Self-Supervised Representation Learning on Multimodal and Temporal Data | RMIT University | arXiv | 2022 |
| Self-Supervised Representation Learning: Introduction, advances, and challenges | University of Edinburgh | IEEE Signal Processing Magazine | 2022 |
| A Brief Overview of Universal Sentence Representation Methods: A Linguistic View | KU Leuven | ACM CSUR | 2022 |
| Network Representation Learning: From Preprocessing, Feature Extraction to Node Embedding | Soochow University | ACM CSUR | 2022 |
| Evaluation Methods for Representation Learning: A Survey | University of Tokyo | IJCAI | 2022 |
| Self-Supervised Speech Representation Learning: A Review | Meta | IEEE JSTSP | 2022 |
| A Survey on Hypergraph Representation Learning | Università degli Studi di Torino | ACM CSUR | 2023 |
| Representation learning for knowledge fusion and reasoning in Cyber–Physical–Social Systems: Survey and perspectives | Hainan University | Information Fusion | 2023 |
| Survey of Deep Representation Learning for Speech Emotion Recognition | University of Southern Queensland | IEEE TAFFC | 2023 |
| Graph Representation Learning and Its Applications: A Survey | Catholic University of Korea | MDPI Sensors | 2023 |
| Graph Representation Learning Meets Computer Vision: A Survey | Xidian University | IEEE TAI | 2023 |
| A Comprehensive Survey on Deep Graph Representation Learning | Peking University | arXiv | 2023 |
| Dynamic Graph Representation Learning with Neural Networks: A Survey | University of Rouen Normandy | arXiv | 2023 |
| Multiscale Representation Learning for Image Classification: A Survey | Xidian University | IEEE TAI | 2023 |
| A Survey on Protein Representation Learning: Retrospect and Prospect | Westlake University | arXiv | 2023 |

## Research Papers (Latest Update: KDD 2024)

### Neural Architectural Approaches

Studies in this group focus on the novel design of neural architectures, either by combining basic building blocks or by redesigning an architecture from scratch, to better capture temporal dependencies and inter-variable relationships in multivariate time series. We further categorize these studies into basic block combination and innovative redesign, based on the degree of architectural adjustment.
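
As a rough illustration of the block-combination idea, here is a minimal PyTorch sketch (illustrative only, not taken from any paper listed below; the module name `ConvTransformerEncoder` and all hyperparameters are assumptions): a 1-D convolution captures local temporal patterns, a Transformer encoder models longer-range dependencies, and mean pooling over time yields one representation per series.

```python
# Minimal sketch of combining basic building blocks into a time-series encoder.
# Names and hyperparameters are illustrative, not from any specific paper.
import torch
import torch.nn as nn


class ConvTransformerEncoder(nn.Module):
    def __init__(self, n_vars: int, d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        # Temporal convolution mixes the input variables and extracts local patterns.
        self.conv = nn.Conv1d(n_vars, d_model, kernel_size=3, padding=1)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_vars) -> series-level representation (batch, d_model)
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)  # local temporal patterns
        h = self.transformer(h)                           # long-range dependencies
        return h.mean(dim=1)                              # pool over time


model = ConvTransformerEncoder(n_vars=7)
z = model(torch.randn(8, 96, 7))  # 8 series, 96 time steps, 7 variables
print(z.shape)                    # torch.Size([8, 64])
```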

| Year | Title | Venue |
|---|---|---|
| 2018 | Learning representations of multivariate time series with missing data | Pattern Recognition |
| 2018 | Multilevel Wavelet Decomposition Network for Interpretable Time Series Analysis | KDD |
| 2019 | Latent ODEs for Irregularly-Sampled Time Series | NeurIPS |
| 2019 | Learning Disentangled Representations of Satellite Image Time Series | ECML PKDD |
| 2019 | Unsupervised Scalable Representation Learning for Multivariate Time Series | NeurIPS |
| 2019 | Audio Word2vec: Sequence-to-Sequence Autoencoding for Unsupervised Learning of Audio Segmentation and Representation | IEEE/ACM TASLP |
| 2019 | Towards Explainable Representation of Time-Evolving Graphs via Spatial-Temporal Graph Attention Networks | CIKM |
| 2020 | A real-time action representation with temporal encoding and deep compression | IEEE TCSVT |
| 2020 | End-to-End Incomplete Time-Series Modeling From Linear Memory of Latent Variables | IEEE TCYB |
| 2020 | Memory-Augmented Dense Predictive Coding for Video Representation Learning | ECCV |
| 2020 | Temporal Aggregate Representations for Long-Range Video Understanding | ECCV |
| 2021 | Attentive Neural Controlled Differential Equations for Time-series Classification and Forecasting | ICDM |
| 2021 | TE-ESN: Time Encoding Echo State Network for Prediction Based on Irregularly Sampled Time Series Data | IJCAI |
| 2021 | SSAN: Separable Self-Attention Network for Video Representation Learning | CVPR |
| 2021 | Multi-Time Attention Networks for Irregularly Sampled Time Series | ICLR |
| 2021 | Time Series Domain Adaptation via Sparse Associative Structure Alignment | AAAI |
| 2021 | A deep multi-task representation learning method for time series classification and retrieval | Information Sciences |
| 2021 | A Transformer-based Framework for Multivariate Time Series Representation Learning | KDD |
| 2021 | DeLTa: Deep local pattern representation for time-series clustering and classification using visual perception | KBS |
| 2021 | TriBERT: Human-centric Audio-visual Representation Learning | NeurIPS |
| 2022 | CrossPyramid: Neural Ordinary Differential Equations Architecture for Partially-observed Time-series | arXiv |
| 2022 | EXIT: Extrapolation and Interpolation-based Neural Controlled Differential Equations for Time-series Classification and Forecasting | WebConf |
| 2022 | Modeling Irregular Time Series with Continuous Recurrent Units | ICML |
| 2022 | Towards Learning Disentangled Representations for Time Series | KDD |
| 2022 | MARINA: An MLP-Attention Model for Multivariate Time-Series Analysis | CIKM |
| 2022 | TARNet: Task-Aware Reconstruction for Time-Series Transformer | KDD |
| 2022 | Weakly Paired Associative Learning for Sound and Image Representations via Bimodal Associative Memory | CVPR |
| 2022 | Decoupling Local and Global Representations of Time Series | AISTATS |
| 2022 | HyperTime: Implicit Neural Representations for Time Series | NeurIPS (Workshop) |
| 2022 | TCGL: Temporal Contrastive Graph for Self-Supervised Video Representation Learning | IEEE TIP |
| 2022 | Unsupervised Time-Series Representation Learning with Iterative Bilinear Temporal-Spectral Fusion | ICML |
| 2023 | ContiFormer: Continuous-Time Transformer for Irregular Time Series Modeling | NeurIPS |
| 2023 | TriD-MAE: A Generic Pre-trained Model for Multivariate Time Series with Missing Values | CIKM |
| 2023 | Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series | ICML |
| 2023 | Contrast Everything: A Hierarchical Contrastive Framework for Medical Time-Series | NeurIPS |
| 2023 | One Fits All: Universal Time Series Analysis by Pretrained LM and Specially Designed Adaptors | arXiv |
| 2023 | Multivariate Time Series Representation Learning via Hierarchical Correlation Pooling Boosted Graph Neural Network | IEEE TAI |
| 2023 | One Fits All: Power General Time Series Analysis by Pretrained LM | NeurIPS |
| 2023 | One Transformer for All Time Series: Representing and Training with Time-Dependent Heterogeneous Tabular Data | arXiv |
| 2023 | Time Series Continuous Modeling for Imputation and Forecasting with Implicit Neural Representations | arXiv |
| 2023 | UniTS: A Universal Time Series Analysis Framework with Self-supervised Representation Learning | arXiv |
| 2023 | Modeling Temporal Data as Continuous Functions with Stochastic Process Diffusion | ICML |
| 2023 | TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis | ICLR |
| 2023 | A Shapelet-based Framework for Unsupervised Multivariate Time Series Representation Learning | VLDB |
| 2023 | A Multi-Scale Decomposition MLP-Mixer for Time Series Analysis | arXiv |
| 2023 | Effectively Modeling Time Series with Simple Discrete State Spaces | ICLR |
| 2023 | FEAT: A general framework for Feature-aware Multivariate Time-series Representation Learning | KBS |
| 2023 | Learning Robust and Consistent Time Series Representations: A Dilated Inception-Based Approach | arXiv |
| 2023 | Multi-Task Self-Supervised Time-Series Representation Learning | arXiv |
| 2023 | WHEN: A Wavelet-DTW Hybrid Attention Network for Heterogeneous Time Series Analysis | KDD |
| 2023 | Sparse Binary Transformers for Multivariate Time Series Modeling | KDD |
| 2024 | NEWTIME: Numerically Multi-Scaled Embedding for Large-Scale Time Series Pretraining | arXiv |
| 2024 | Fully-Connected Spatial-Temporal Graph for Multivariate Time-Series Data | AAAI |
| 2024 | FITS: Modeling Time Series with 10k Parameters | ICLR |
| 2024 | GAFormer: Enhancing Timeseries Transformers Through Group-Aware Embeddings | ICLR |
| 2024 | T-Rep: Representation Learning for Time Series Using Time-Embeddings | ICLR |
| 2024 | UniTS: A Unified Multi-Task Time Series Model | arXiv |
| 2024 | Stable Neural Stochastic Differential Equations in Analyzing Irregular Time Series Data | ICLR |
| 2024 | CNN Kernels Can Be the Best Shapelets | ICLR |
| 2024 | Learning to Embed Time Series Patches Independently | ICLR |
| 2024 | CARD: Channel Aligned Robust Blend Transformer for Time Series Forecasting | ICLR |
| 2024 | ModernTCN: A Modern Pure Convolution Structure for General Time Series Analysis | ICLR |
| 2024 | UP2ME: Univariate Pre-training to Multivariate Fine-tuning as a General-purpose Framework for Multivariate Time Series Analysis | ICML |
| 2024 | TSLANet: Rethinking Transformers for Time Series Representation Learning | ICML |
| 2024 | Timer: Generative Pre-trained Transformers Are Large Time Series Models | ICML |
| 2024 | MF-CLR: Multi-Frequency Contrastive Learning Representation for Time Series | ICML |
| 2024 | MOMENT: A Family of Open Time-series Foundation Models | ICML |
| 2024 | Self-Supervised Learning of Time Series Representation via Diffusion Process and Imputation-Interpolation-Forecasting Mask | KDD |

### Learning-Focused Approaches

Studies in this category focus on devising novel objective functions or pretext tasks for the representation learning process, i.e., model training. The learning objectives can be categorized as supervised, unsupervised, or self-supervised, depending on the use of labeled instances. The difference between unsupervised and self-supervised learning lies in the use of pseudo labels: unsupervised learning relies on reconstructing the input, whereas self-supervised learning derives pseudo labels from the data and uses them as supervision signals.
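
To make the distinction concrete, here is a minimal PyTorch-style sketch (illustrative only; `enc`, `dec`, and `augment` are hypothetical callables, not from any paper listed below): the unsupervised objective reconstructs the raw input, whereas the self-supervised objective creates pseudo labels, here the indices pairing two augmented views of the same series, and optimizes an InfoNCE-style contrastive pretext task against them.

```python
# Sketch of the two label-free objectives described above. `enc` maps a batch
# of shape (batch, time, n_vars) to latents (batch, time, d_model), `dec`
# inverts it, and `augment` returns a randomly augmented copy of the batch.
import torch
import torch.nn.functional as F


def reconstruction_loss(enc, dec, x):
    """Unsupervised objective: encode, decode, and compare with the raw input."""
    x_hat = dec(enc(x))  # reconstruction with the same shape as x
    return F.mse_loss(x_hat, x)


def contrastive_pretext_loss(enc, augment, x, temperature=0.1):
    """Self-supervised objective: two augmented views of the same series form a
    positive pair; the pair index serves as the pseudo label (InfoNCE-style)."""
    z1 = F.normalize(enc(augment(x)).mean(dim=1), dim=-1)     # (batch, d_model)
    z2 = F.normalize(enc(augment(x)).mean(dim=1), dim=-1)
    logits = z1 @ z2.t() / temperature                        # view-to-view similarity
    pseudo_labels = torch.arange(x.size(0), device=x.device)  # positives on the diagonal
    return F.cross_entropy(logits, pseudo_labels)
```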

| Year | Title | Venue |
|---|---|---|
| 2018 | Random Warping Series: A Random Features Method for Time-Series Embedding | AISTATS |
| 2018 | Sqn2Vec: Learning Sequence Representation via Sequential Patterns with a Gap Constraint | ECML PKDD |
| 2019 | Learning Disentangled Representations of Satellite Image Time Series | ECML PKDD |
| 2019 | Wave2Vec: Deep representation learning for clinical temporal data | Neurocomputing |
| 2019 | Unsupervised Scalable Representation Learning for Multivariate Time Series | NeurIPS |
| 2019 | Audio Word2vec: Sequence-to-Sequence Autoencoding for Unsupervised Learning of Audio Segmentation and Representation | IEEE/ACM TASLP |
| 2020 | TimeAutoML: Autonomous Representation Learning for Multivariate Irregularly Sampled Time Series | arXiv |
| 2020 | Learning Representations from Audio-Visual Spatial Alignment | NeurIPS |
| 2020 | Cycle-Contrast for Self-Supervised Video Representation Learning | NeurIPS |
| 2020 | End-to-End Incomplete Time-Series Modeling From Linear Memory of Latent Variables | IEEE TCYB |
| 2020 | Self-supervised Video Representation Learning by Pace Prediction | ECCV |
| 2020 | Memory-Augmented Dense Predictive Coding for Video Representation Learning | ECCV |
| 2021 | Unsupervised Representation Learning for Time Series with Temporal Neighborhood Coding | ICLR |
| 2021 | Long Short View Feature Decomposition via Contrastive Video Representation Learning | ICCV |
| 2021 | A Transformer-based Framework for Multivariate Time Series Representation Learning | KDD |
| 2021 | TriBERT: Human-centric Audio-visual Representation Learning | NeurIPS |
| 2021 | Learning by aligning videos in time | CVPR |
| 2021 | Representation Learning via Global Temporal Alignment and Cycle-Consistency | CVPR |
| 2021 | Spatiotemporal Contrastive Video Representation Learning | CVPR |
| 2021 | Time-Equivariant Contrastive Video Representation Learning | ICCV |
| 2021 | Time-Series Representation Learning via Temporal and Contextual Contrasting | IJCAI |
| 2021 | RSPNet: Relative Speed Perception for Unsupervised Video Representation Learning | AAAI |
| 2022 | TS-Rep: Self-supervised time series representation learning from robot sensor data | NeurIPS (Workshop) |
| 2022 | TARNet: Task-Aware Reconstruction for Time-Series Transformer | KDD |
| 2022 | TimeCLR: A self-supervised contrastive learning framework for univariate time series representation | KBS |
| 2022 | Weakly Paired Associative Learning for Sound and Image Representations via Bimodal Associative Memory | CVPR |
| 2022 | Contrastive Spatio-Temporal Pretext Learning for Self-Supervised Video Representation | AAAI |
| 2022 | Dual Contrastive Learning for Spatio-temporal Representation | MM |
| 2022 | Self-Supervised Contrastive Pre-Training For Time Series via Time-Frequency Consistency | NeurIPS |
| 2022 | Self-Supervised Time Series Representation Learning with Temporal-Instance Similarity Distillation | ICML (Workshop) |
| 2022 | TS2Vec: Towards Universal Representation of Time Series | AAAI |
| 2022 | Cross-Architecture Self-supervised Video Representation Learning | CVPR |
| 2022 | Learning from Untrimmed Videos: Self-Supervised Video Representation Learning with Hierarchical Consistency | CVPR |
| 2022 | TransRank: Self-supervised Video Representation Learning via Ranking-based Transformation Recognition | CVPR |
| 2022 | Frame-wise Action Representations for Long Videos via Sequence Contrastive Learning | CVPR |
| 2022 | On Temporal Granularity in Self-Supervised Video Representation Learning | BMVC |
| 2022 | Self-Supervised Spatiotemporal Representation Learning by Exploiting Video Continuity | AAAI |
| 2022 | Hierarchically Decoupled Spatial-Temporal Contrast for Self-supervised Video Representation Learning | WACV |
| 2022 | Self-supervised Video Representation Learning by Uncovering Spatio-temporal Statistics | IEEE TPAMI |
| 2022 | TCGL: Temporal Contrastive Graph for Self-Supervised Video Representation Learning | IEEE TIP |
| 2022 | TCLR: Temporal contrastive learning for video representation | CVIU |
| 2023 | PrimeNet: Pre-Training for Irregular Multivariate Time Series | AAAI |
| 2023 | Contrast Everything: A Hierarchical Contrastive Framework for Medical Time-Series | NeurIPS |
| 2023 | SimMTM: A Simple Pre-Training Framework for Masked Time-Series Modeling | |
