LSTMnet

License: GPL v3

LSTM network

The LSTM network is implemented with memory blocks that each contain a single memory cell. The input layer is fully connected to the hidden layer, and the weights of the network are randomly initialized. Every gate in a memory cell has a bias value; the biases are initialized randomly and adjusted while training the network. The input vector fed to the network consists of the previous data points of the time series, and its size can be adjusted. The number of time steps the network is unfolded for during training, the number of training iterations, and the learning rate can also be adjusted.
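
As an illustration of how such input vectors can be formed, the sketch below (a hypothetical helper, not taken from the repository code) slides a window of length inputVecSize over a raw time series, pairing each window of previous points with the value that immediately follows it:

#include <cstddef>
#include <vector>

// Hypothetical helper (not part of LSTMnet): build sliding-window samples.
// inputs[j] holds the inputVecSize points that precede targets[j].
void makeWindows(const std::vector<double>& series, int inputVecSize,
                 std::vector<std::vector<double>>& inputs,
                 std::vector<double>& targets) {
    for (std::size_t i = 0; i + inputVecSize < series.size(); ++i) {
        inputs.emplace_back(series.begin() + i, series.begin() + i + inputVecSize);
        targets.push_back(series[i + inputVecSize]);
    }
}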

Network Architecture

(Architecture diagram: structure of the network)
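
For reference, the per-time-step computation inside one memory cell follows the standard LSTM gate equations, with a bias on each gate as described above. The sketch below is a generic illustration of those equations; the variable names and the omission of the recurrent weights on the previous output are simplifications, not the repository's actual code:

#include <cmath>
#include <cstddef>
#include <vector>

// Generic single-cell LSTM step (illustrative only). Each gate has its own
// weight vector and bias term. Recurrent weights on the previous output h
// are omitted here for brevity.
struct CellState { double c = 0.0; double h = 0.0; };

static double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

static double dot(const std::vector<double>& w, const std::vector<double>& x) {
    double s = 0.0;
    for (std::size_t i = 0; i < w.size(); ++i) s += w[i] * x[i];
    return s;
}

void lstmStep(const std::vector<double>& x, CellState& state,
              const std::vector<double>& wi, double bi,   // input gate
              const std::vector<double>& wf, double bf,   // forget gate
              const std::vector<double>& wo, double bo,   // output gate
              const std::vector<double>& wg, double bg) { // candidate input
    double i = sigmoid(dot(wi, x) + bi);
    double f = sigmoid(dot(wf, x) + bf);
    double o = sigmoid(dot(wo, x) + bo);
    double g = std::tanh(dot(wg, x) + bg);
    state.c = f * state.c + i * g;     // new cell state
    state.h = o * std::tanh(state.c);  // cell output
}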

Creating A Network

Initializing variables
int memCells = 5;          // number of memory cells
int trainDataSize = 300;   // training data size
int inputVecSize = 60;     // input vector size
int timeSteps = 60;        // unfolded time steps
float learningRate = 0.02; // learning rate
int predictions = 1300;    // number of prediction points
int iterations = 10;       // training iterations over the training data
Initializing the network
LSTMNet lstm(memCells, inputVecSize);
Training
lstm.train(input, targetVector, trainDataSize, timeSteps, learningRate, iterations);
Testing
double result;
result = lstm.predict(input);
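
Putting these pieces together, a complete usage might look like the sketch below. The header name and the exact argument types of train and predict (an array of input vectors, a target vector, and a pointer to one input vector, respectively) are assumptions inferred from the calls above and should be checked against the class declaration in the repository:

#include <cmath>
#include <cstddef>
#include <vector>
#include "LSTMNet.h" // header name assumed; check the repository sources

int main() {
    int memCells = 5, trainDataSize = 300, inputVecSize = 60, timeSteps = 60;
    float learningRate = 0.02f;
    int iterations = 10;

    // Toy data: a sine wave stands in for a real time series.
    std::vector<double> series(trainDataSize + inputVecSize);
    for (std::size_t i = 0; i < series.size(); ++i) series[i] = std::sin(0.02 * i);

    // Sliding-window samples: input[j] holds the inputVecSize points that
    // precede targetVector[j]. The container types are assumptions.
    std::vector<double>* input = new std::vector<double>[trainDataSize];
    std::vector<double> targetVector(trainDataSize);
    for (int j = 0; j < trainDataSize; ++j) {
        input[j].assign(series.begin() + j, series.begin() + j + inputVecSize);
        targetVector[j] = series[j + inputVecSize];
    }

    LSTMNet lstm(memCells, inputVecSize);
    lstm.train(input, targetVector, trainDataSize, timeSteps, learningRate, iterations);

    // One-step prediction from the most recent window of the series.
    std::vector<double> window(series.end() - inputVecSize, series.end());
    double result = lstm.predict(&window);
    (void)result; // use or store the predicted value

    delete[] input;
    return 0;
}

For multi-step forecasting, the predicted value can be appended to the series and the window advanced by one point, repeating the predict call for the desired number of prediction points (the predictions variable above).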
