# TPA-LSTM

Original implementation of "Temporal Pattern Attention for Multivariate Time Series Forecasting".

## Dependencies

- Python 3.6.6

The remaining dependencies are listed in `requirements.txt` and can be installed with:

```
$ pip install -r requirements.txt
# to install TensorFlow, refer to https://www.tensorflow.org/install/
```

## Usage

The following examples show how to train and test a TPA-LSTM model on MuseData with the settings used in this work.

### Training

```
$ python main.py --mode train \
    --attention_len 16 \
    --batch_size 32 \
    --data_set muse \
    --dropout 0.2 \
    --learning_rate 1e-5 \
    --model_dir ./models/model \
    --num_epochs 40 \
    --num_layers 3 \
    --num_units 338
```
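Purely as an illustration of what these flags mean, the command line above could be modeled with `argparse` as follows. This is a hypothetical sketch, not the actual flag definitions in this repo's `main.py` (which may use a different parsing mechanism and defaults):

```python
import argparse

# Hypothetical re-creation of the CLI shown above; names mirror the flags
# in the example command, but the real main.py may define them differently.
parser = argparse.ArgumentParser(description="TPA-LSTM (illustrative flags)")
parser.add_argument("--mode", choices=["train", "test"], required=True,
                    help="train a new model or evaluate a saved one")
parser.add_argument("--attention_len", type=int, default=16,
                    help="length of the attention window over past steps")
parser.add_argument("--batch_size", type=int, default=32)
parser.add_argument("--data_set", default="muse")
parser.add_argument("--dropout", type=float, default=0.2)
parser.add_argument("--learning_rate", type=float, default=1e-5)
parser.add_argument("--model_dir", default="./models/model",
                    help="where checkpoints are written/read")
parser.add_argument("--num_epochs", type=int, default=40)
parser.add_argument("--num_layers", type=int, default=3)
parser.add_argument("--num_units", type=int, default=338,
                    help="hidden units per recurrent layer")

args = parser.parse_args(["--mode", "train", "--num_units", "338"])
print(args.mode, args.num_units)  # -> train 338
```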

### Testing

```
$ python main.py --mode test \
    --attention_len 16 \
    --batch_size 32 \
    --data_set muse \
    --dropout 0.2 \
    --learning_rate 1e-5 \
    --model_dir ./models/model \
    --num_epochs 40 \
    --num_layers 3 \
    --num_units 338
```
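For orientation, here is a minimal NumPy sketch of the temporal pattern attention idea described in the paper: CNN filters are applied row-wise (along time) to a matrix of past RNN hidden states, each resulting row is scored against the current hidden state with a sigmoid gate, and the weighted rows form a context vector that is combined with the query. Random matrices stand in for the learned filters and projections, and all variable names are illustrative, not taken from this repo:

```python
import numpy as np

rng = np.random.default_rng(0)

m, w, k = 4, 8, 3                   # features, window length, CNN filters
H = rng.normal(size=(m, w))         # past hidden states, one row per feature
h_t = rng.normal(size=(m,))         # current hidden state (the query)

# 1) Each of k filters spans the full window, so convolving a row of H with
#    a filter collapses to a dot product: one value per (feature, filter).
C = rng.normal(size=(k, w))         # CNN filter weights (learned in practice)
HC = H @ C.T                        # temporal-pattern matrix, shape (m, k)

# 2) Score each pattern row against the query; the paper gates with a
#    sigmoid rather than a softmax, so rows are weighted independently.
W_a = rng.normal(size=(k, m))
scores = HC @ W_a @ h_t             # one score per feature row, shape (m,)
alpha = 1.0 / (1.0 + np.exp(-scores))   # attention weights in (0, 1)

# 3) Context vector: attention-weighted sum of the pattern rows.
v_t = (alpha[:, None] * HC).sum(axis=0)     # shape (k,)

# 4) Combine query and context for the final representation.
W_h = rng.normal(size=(m, m))
W_v = rng.normal(size=(m, k))
h_prime = W_h @ h_t + W_v @ v_t             # shape (m,)
```

In the actual model these weights are trained end to end with the LSTM; the sketch only shows how the shapes fit together.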