
STAEformer: Spatio-Temporal Adaptive Embedding Transformer

H. Liu*, Z. Dong*, R. Jiang#, J. Deng, J. Deng, Q. Chen, X. Song#, "Spatio-Temporal Adaptive Embedding Makes Vanilla Transformer SOTA for Traffic Forecasting", Proc. of 32nd ACM International Conference on Information and Knowledge Management (CIKM), 2023. (*Equal Contribution, #Corresponding Author)

(Figure: STAEformer model architecture)

Citation

@inproceedings{liu2023spatio,
  title={Spatio-temporal adaptive embedding makes vanilla transformer sota for traffic forecasting},
  author={Liu, Hangchen and Dong, Zheng and Jiang, Renhe and Deng, Jiewen and Deng, Jinliang and Chen, Quanjun and Song, Xuan},
  booktitle={Proceedings of the 32nd ACM International Conference on Information and Knowledge Management},
  pages={4125--4129},
  year={2023}
}

CIKM23 Proceedings (including METRLA, PEMSBAY, PEMS04, PEMS07, PEMS08 results)

https://dl.acm.org/doi/abs/10.1145/3583780.3615160

Preprints (including METRLA, PEMSBAY, PEMS03, PEMS04, PEMS07, PEMS08 results)

arXiv link

Performance on Traffic Forecasting Benchmarks


(Figures: performance comparison on the traffic forecasting benchmarks)

Required Packages

pytorch>=1.11
numpy
pandas
matplotlib
pyyaml
pickle
torchinfo
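Assuming a pip-based environment, the dependencies above can be installed for instance as follows (note that PyTorch's pip package is named `torch`, and `pickle` ships with the Python standard library, so it needs no separate install):

```shell
# Install the required packages listed above.
pip install "torch>=1.11" numpy pandas matplotlib pyyaml torchinfo
```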

Training Commands

cd model/
python train.py -d <dataset> -g <gpu_id>

<dataset>:

  • METRLA
  • PEMSBAY
  • PEMS03
  • PEMS04
  • PEMS07
  • PEMS08
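For instance, an illustrative invocation of the command above, training on METRLA with GPU 0:

```shell
cd model/
# -d selects the dataset, -g the GPU id.
python train.py -d METRLA -g 0
```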

About

[CIKM'23] Official code for our paper "Spatio-Temporal Adaptive Embedding Makes Vanilla Transformer SOTA for Traffic Forecasting".
