
LSTM-based-Stacked-Autoencoder

This is an implementation of the LSTM-based Stacked Autoencoder (LSTM-SAE) model.

The model is described in the paper:

Unsupervised Pre-training of a Deep LSTM-based Stacked Autoencoder for Multivariate Time Series Forecasting Problems
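As a quick illustration of the building block behind LSTM-SAE, here is a minimal Keras sketch of a one-hidden-layer LSTM autoencoder pre-trained to reconstruct its input windows. The window length, feature count, hidden size, and training settings are assumed values for illustration, not the configuration used in the paper or in this repository.

```python
# Minimal sketch of a one-hidden-layer LSTM autoencoder (illustrative values only).
from keras.models import Sequential
from keras.layers import LSTM, RepeatVector, TimeDistributed, Dense

timesteps, n_features, units = 24, 8, 64   # assumed window length, feature count, hidden size

model = Sequential()
# Encoder: compress each input window into a fixed-size hidden representation
model.add(LSTM(units, input_shape=(timesteps, n_features)))
# Repeat the hidden vector once per output timestep
model.add(RepeatVector(timesteps))
# Decoder: reconstruct the original window from the hidden representation
model.add(LSTM(units, return_sequences=True))
model.add(TimeDistributed(Dense(n_features)))

model.compile(optimizer='adam', loss='mse')
# Unsupervised pre-training: the target is the input itself
# model.fit(X, X, epochs=20, batch_size=32)   # X has shape (samples, timesteps, n_features)
```

In a stacked setup, additional LSTM layers are typically pre-trained on top of the learned encoder in the same layer-wise fashion before fine-tuning on the forecasting task.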

The repository contains six Python files and one CSV file, including:

  1. pollution.csv --> the multivariate dataset
  2. 1layer_selection.py --> hyperparameter selection for the model; the "1" refers to one hidden layer (a hedged sketch of this step follows the list)
  3. 1layer_evaluate.py --> evaluation of the one-hidden-layer model
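Below is a hedged sketch of how hyperopt-driven hyperparameter selection for the one-layer model might look, in the spirit of 1layer_selection.py. The search space, data shapes, and training settings are illustrative assumptions, not the repository's actual code.

```python
# Illustrative hyperopt search for the one-hidden-layer LSTM autoencoder
# (assumed search space and placeholder data, not the repository's code).
import numpy as np
from hyperopt import fmin, tpe, hp, Trials, STATUS_OK
from keras.models import Sequential
from keras.layers import LSTM, RepeatVector, TimeDistributed, Dense

timesteps, n_features = 24, 8                    # assumed window shape
X = np.random.rand(256, timesteps, n_features)   # placeholder for the prepared pollution data

def objective(params):
    # Build the one-hidden-layer autoencoder with the sampled hyperparameters
    units = int(params['units'])
    model = Sequential()
    model.add(LSTM(units, input_shape=(timesteps, n_features)))
    model.add(RepeatVector(timesteps))
    model.add(LSTM(units, return_sequences=True))
    model.add(TimeDistributed(Dense(n_features)))
    model.compile(optimizer='adam', loss='mse')
    history = model.fit(X, X, epochs=2, batch_size=int(params['batch_size']),
                        validation_split=0.2, verbose=0)
    # Minimise the validation reconstruction error
    return {'loss': history.history['val_loss'][-1], 'status': STATUS_OK}

space = {
    'units': hp.quniform('units', 32, 256, 32),
    'batch_size': hp.choice('batch_size', [32, 64, 128]),
}

best = fmin(objective, space, algo=tpe.suggest, max_evals=10, trials=Trials())
print(best)
```

Note that for `hp.choice` parameters, `fmin` returns the index of the selected option rather than its value.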

Environment

OS: Ubuntu 17.10

OS type: 64-bit

Used libraries:

1- Keras (2.1.5)

2- tensorflow-gpu (1.10.0)

3- hyperopt (0.1.1)

4- pandas (0.24.2)

5- scikit-learn (0.20.3)

6- scipy (1.1.0)

7- numpy (1.15.1)

8- matplotlib (2.0.0)
