
qianlima-lab/time-series-ptms

A Survey on Time-Series Pre-Trained Models

This repository contains the training code for our paper "A Survey on Time-Series Pre-Trained Models".

Datasets

The datasets used in this project include the UCR and UEA time-series classification archives.

Pre-Trained Models on Time Series Classification

Usage (Transfer Learning)

1. To pre-train a model on your own dataset, run

```bash
python train.py --dataroot [your UCR datasets directory] --task [type of pre-training task: classification or reconstruction] --dataset [name of the dataset you want to pretrain on] --backbone [fcn or dilated] --mode pretrain ...
```

2. To fine-tune the model for classification on a dataset, run

```bash
python train.py --dataroot [your UCR datasets directory] --dataset [name of the dataset you want to finetune on] --source_dataset [the dataset you pretrained on] --save_dir [the directory to save the pretrained weights] --mode finetune ...
```

To see all available options, run

```bash
python train.py -h
```

For detailed options and examples, please refer to scripts/transfer_pretrain_finetune.sh.
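As a concrete illustration, the two steps above can be chained. GunPoint and Wine are example UCR dataset names (the repository provides quick-start weights for both elsewhere in this README); the dataset and checkpoint paths below are hypothetical placeholders, not the repository's defaults:

```bash
# Hypothetical example: pre-train on GunPoint with the classification pretext task,
# then fine-tune the resulting weights on Wine.
python train.py --dataroot ./datasets/UCR --task classification \
    --dataset GunPoint --backbone fcn --mode pretrain
python train.py --dataroot ./datasets/UCR --dataset Wine \
    --source_dataset GunPoint --save_dir ./checkpoints --mode finetune
```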

Usage (Transformer and Contrastive Learning)

| ID | Method | Architecture | Year | Venue | Source Code |
| --- | --- | --- | --- | --- | --- |
| 1 | TS2Vec | Contrastive Learning | 2022 | AAAI | github-link |
| 2 | TS-TCC | Contrastive Learning & Transformer | 2021 | IJCAI | github-link |
| 3 | TST | Transformer | 2021 | KDD | github-link |
| 4 | Triplet-loss | Contrastive Learning | 2019 | NeurIPS | github-link |
| 5 | SelfTime | Contrastive Learning | 2021 | Submitted to ICLR | github-link |
1. To pre-train and classify with the TS2Vec model on a UCR dataset, run

```bash
python train_tsm.py --dataroot [your UCR datasets directory] --normalize_way single ...
```

For detailed options and examples, please refer to ts2vec_cls/scripts/ts2vec_tsm_single_norm.sh
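For instance, assuming the UCR archive lives at ./datasets/UCR (a hypothetical path):

```bash
# Hypothetical example: TS2Vec pre-training and classification over the UCR archive.
# --normalize_way single selects the "single" normalization scheme.
python train_tsm.py --dataroot ./datasets/UCR --normalize_way single
```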

2. To pre-train and classify with the TS-TCC model on a UCR dataset, run

```bash
python main_ucr.py --dataset [name of the ucr dataset] --device cuda:0 --save_csv_name tstcc_ucr_ --seed 42
```

For detailed options and examples, please refer to tstcc_cls/scripts/fivefold_tstcc_ucr.sh
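For instance, with GunPoint (a UCR dataset name mentioned elsewhere in this README) substituted for the placeholder:

```bash
# Hypothetical example: TS-TCC on GunPoint with the flags shown above.
python main_ucr.py --dataset GunPoint --device cuda:0 --save_csv_name tstcc_ucr_ --seed 42
```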

3. To pre-train and fine-tune (classification) the TST model on a UCR dataset, run

```bash
python src/main.py --dataset [dataset name] --data_dir [path of the dataset] --batch_size [batch size] --task pretrain_and_finetune --epochs [number of epochs]
```

To run the classification task with the Transformer encoder on a UCR dataset, run

```bash
python src/main.py --dataset [dataset name] --data_dir [path of the dataset] --batch_size [batch size] --task classification --epochs [number of epochs]
```

For detailed options and examples for training on the full UCR128 archive, please refer to tst_cls/scripts/pretrain_finetune.sh and tst_cls/scripts/classification.sh, or simply run

```bash
python src/main.py -h
```
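As a concrete sketch of both TST tasks, with GunPoint as the example dataset and the path, batch size, and epoch count all hypothetical values:

```bash
# Hypothetical example: pre-train then fine-tune TST on GunPoint,
# followed by plain supervised classification with the Transformer encoder.
python src/main.py --dataset GunPoint --data_dir ./datasets/UCR/GunPoint \
    --batch_size 16 --task pretrain_and_finetune --epochs 100
python src/main.py --dataset GunPoint --data_dir ./datasets/UCR/GunPoint \
    --batch_size 16 --task classification --epochs 100
```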
4. To pre-train and classify with the Triplet-loss model on a UCR dataset, run

```bash
python ucr.py --dataset [name of the ucr dataset] --path [your UCR datasets directory] --hyper [hyperparameters file path (./default_hyperparameters.json for the default option)] --cuda
```

For detailed options and examples, please refer to tloss_cls/scripts/ucr.sh
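For instance, using the default hyperparameter file named above and a hypothetical archive path:

```bash
# Hypothetical example: Triplet-loss pre-training and classification on GunPoint
# with the repository's default hyperparameters.
python ucr.py --dataset GunPoint --path ./datasets/UCR \
    --hyper ./default_hyperparameters.json --cuda
```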

To pre-train and classify with the Triplet-loss model on a UEA dataset, run

```bash
python uea.py --dataset [name of the uea dataset] --path [your UEA datasets directory] --hyper [hyperparameters file path (./default_hyperparameters.json for the default option)] --cuda
```

For detailed options and examples, please refer to tloss_cls/scripts/uea.sh

5. To pre-train and classify with the SelfTime model on a UCR dataset, run

```bash
python -u train_ssl.py --dataset_name [dataset name] --model_name SelfTime --ucr_path [your UCR datasets directory] --random_seed 42
```

For detailed options and examples, please refer to selftime_cls/scripts/ucr.sh
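For instance, with GunPoint substituted for the dataset placeholder and a hypothetical archive path:

```bash
# Hypothetical example: SelfTime pre-training and classification on GunPoint
# with the fixed random seed shown above.
python -u train_ssl.py --dataset_name GunPoint --model_name SelfTime \
    --ucr_path ./datasets/UCR --random_seed 42
```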

Usage (Visualization)

- To visualize a model's feature maps, run

```bash
python visualize.py --dataroot [your dataset root] --dataset [dataset name] --backbone [encoder backbone] --graph [cam, heatmap or tsne]
```

- We provide pre-trained weights for the Wine and GunPoint datasets for a quick start.
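For instance, using the provided GunPoint weights with an FCN backbone (the backbone choice and dataset path here are illustrative assumptions):

```bash
# Hypothetical example: t-SNE plot of features from an FCN encoder on GunPoint,
# one of the datasets with provided quick-start weights.
python visualize.py --dataroot ./datasets/UCR --dataset GunPoint --backbone fcn --graph tsne
```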

Pre-Trained Models on Time Series Forecasting

For details, please refer to ts_forecating_methods/README.

Pre-Trained Models on Time Series Anomaly Detection

For details, please refer to ts_anomaly_detection_methods/README.
