Caffe-LSTM

This is an LSTM implementation based on Caffe.
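For reference, the computation such an implementation performs at each time step is the standard (non-peephole) LSTM cell below. This is a sketch of the underlying math, not necessarily this repository's exact formulation or variable naming:

```latex
% Standard LSTM cell, one time step (no peephole connections)
i_t = \sigma(W_{xi} x_t + W_{hi} h_{t-1} + b_i)   % input gate
f_t = \sigma(W_{xf} x_t + W_{hf} h_{t-1} + b_f)   % forget gate
o_t = \sigma(W_{xo} x_t + W_{ho} h_{t-1} + b_o)   % output gate
g_t = \tanh(W_{xg} x_t + W_{hg} h_{t-1} + b_g)    % candidate cell input
c_t = f_t \odot c_{t-1} + i_t \odot g_t           % cell state update
h_t = o_t \odot \tanh(c_t)                        % hidden output
```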

TODO

  • Mini-batch update
  • Examples on real datasets (e.g., handwriting recognition, speech recognition)
  • Peephole connection
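
The peephole connections listed above would let each gate also see the cell state. In the standard formulation they add diagonal cell-state terms to the gate equations given earlier, roughly:

```latex
% Peephole LSTM gates: diagonal weights on the cell state are added
i_t = \sigma(W_{xi} x_t + W_{hi} h_{t-1} + w_{ci} \odot c_{t-1} + b_i)
f_t = \sigma(W_{xf} x_t + W_{hf} h_{t-1} + w_{cf} \odot c_{t-1} + b_f)
o_t = \sigma(W_{xo} x_t + W_{ho} h_{t-1} + w_{co} \odot c_t + b_o)
```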

Example

Example code is in /examples/lstm_sequence/.
In this example, an LSTM network is trained to generate a predefined sequence without any input.
This experiment was introduced in the Clockwork RNN paper.
Four different LSTM networks and shell scripts (.sh) for training them are provided.
Each script generates a log file containing the predicted sequence and the true sequence.
You can use plot_result.m to visualize the result; a minimal Python plotting sketch is also given after the list below.
The results of the four LSTM networks are as follows:

  • 1-layer LSTM with 15 hidden units for the short sequence (result plot)
  • 1-layer LSTM with 50 hidden units for the long sequence (result plot)
  • 3-layer deep LSTM with 7 hidden units for the short sequence (result plot)
  • 3-layer deep LSTM with 23 hidden units for the long sequence (result plot)
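
As an alternative to plot_result.m, the sketch below visualizes one run with matplotlib. The exact log format depends on the training scripts, so this assumes the predicted and true sequences have already been extracted from the log into a hypothetical two-column text file named sequences.txt (not part of the repository):

```python
# Minimal sketch: plot predicted vs. true sequence.
# Assumes "sequences.txt" (hypothetical) has one row per time step:
# column 0 = true value, column 1 = predicted value.
import numpy as np
import matplotlib.pyplot as plt

data = np.loadtxt("sequences.txt")          # shape: (T, 2)
true_seq, pred_seq = data[:, 0], data[:, 1]

plt.plot(true_seq, label="true sequence")
plt.plot(pred_seq, label="predicted sequence", linestyle="--")
plt.xlabel("time step")
plt.ylabel("value")
plt.legend()
plt.title("LSTM sequence generation")
plt.savefig("lstm_sequence.png")
```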