Batch-Normalized LSTM (Recurrent Batch Normalization) implementation in Torch.
Recurrent Batch Normalization

Batch-Normalized LSTMs

Tim Cooijmans, Nicolas Ballas, César Laurent, Çağlar Gülçehre, Aaron Courville

http://arxiv.org/abs/1603.09025
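In brief, the paper batch-normalizes the input-to-hidden and hidden-to-hidden pre-activations separately, and the cell state before the output nonlinearity. Restating its BN-LSTM step, where BN(h; γ, β) denotes batch normalization with gain γ and shift β:

```latex
\begin{aligned}
\begin{pmatrix} \tilde{f}_t \\ \tilde{i}_t \\ \tilde{o}_t \\ \tilde{g}_t \end{pmatrix}
  &= \mathrm{BN}(W_h h_{t-1};\, \gamma_h, \beta_h)
   + \mathrm{BN}(W_x x_t;\, \gamma_x, \beta_x) + b \\
c_t &= \sigma(\tilde{f}_t) \odot c_{t-1} + \sigma(\tilde{i}_t) \odot \tanh(\tilde{g}_t) \\
h_t &= \sigma(\tilde{o}_t) \odot \tanh\!\left(\mathrm{BN}(c_t;\, \gamma_c, \beta_c)\right)
\end{aligned}
```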

Usage

local rnn = LSTM(input_size, rnn_size, n, dropout, bn)

input_size = dimensionality of each input vector

rnn_size = number of hidden units per layer

n = number of stacked layers (≥ 1)

dropout = probability of dropping a unit (0–1)

bn = enable batch normalization (true/false)
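A hypothetical end-to-end call is sketched below. It assumes `LSTM.lua` returns the constructor (e.g. via `dofile`) and that the module follows char-rnn's state layout: `{x, prev_c_1, prev_h_1, ..., prev_c_n, prev_h_n}` in, the next cell/hidden states out.

```lua
-- Hypothetical usage; the module path and state layout are assumptions.
require 'torch'
require 'nn'
require 'nngraph'
local LSTM = dofile('LSTM.lua')  -- assumed to return the constructor

local input_size, rnn_size, n = 65, 128, 2      -- example sizes
local rnn = LSTM(input_size, rnn_size, n, 0.5, true)  -- dropout 0.5, BN on

local batch = 16
local inputs = { torch.randn(batch, input_size) }
for L = 1, n do
  table.insert(inputs, torch.zeros(batch, rnn_size))  -- prev_c for layer L
  table.insert(inputs, torch.zeros(batch, rnn_size))  -- prev_h for layer L
end
local outputs = rnn:forward(inputs)  -- next {c, h} for each layer
```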

Example

https://github.com/iassael/char-rnn
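For orientation, here is a minimal sketch of where the normalization layers sit inside a single timestep. This is a hypothetical single-layer, dropout-free reconstruction in char-rnn-style nngraph code, not the repository's LSTM.lua verbatim:

```lua
-- Hypothetical sketch of one BN-LSTM timestep (not the repo's exact code).
require 'nn'
require 'nngraph'

local function bnlstm_step(input_size, rnn_size)
  local x      = nn.Identity()()
  local prev_c = nn.Identity()()
  local prev_h = nn.Identity()()

  -- Separate affine maps for the input and recurrent paths, each
  -- producing the four gate pre-activations at once.
  local i2h = nn.Linear(input_size, 4 * rnn_size)(x)
  local h2h = nn.Linear(rnn_size, 4 * rnn_size)(prev_h)

  -- Batch-normalize the two pre-activations independently, as in the paper.
  local all_sums = nn.CAddTable()({
    nn.BatchNormalization(4 * rnn_size)(i2h),
    nn.BatchNormalization(4 * rnn_size)(h2h),
  })

  -- Split into the four gates (char-rnn-style).
  local reshaped = nn.Reshape(4, rnn_size)(all_sums)
  local n1, n2, n3, n4 = nn.SplitTable(2)(reshaped):split(4)
  local in_gate      = nn.Sigmoid()(n1)
  local forget_gate  = nn.Sigmoid()(n2)
  local out_gate     = nn.Sigmoid()(n3)
  local in_transform = nn.Tanh()(n4)

  local next_c = nn.CAddTable()({
    nn.CMulTable()({forget_gate, prev_c}),
    nn.CMulTable()({in_gate, in_transform}),
  })

  -- The cell state is batch-normalized before the output nonlinearity.
  local next_h = nn.CMulTable()({
    out_gate,
    nn.Tanh()(nn.BatchNormalization(rnn_size)(next_c)),
  })

  return nn.gModule({x, prev_c, prev_h}, {next_c, next_h})
end
```

The paper additionally recommends initializing the batch-normalization gains γ to 0.1, which the sketch above does not show.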

Performance

Validation scores on char-rnn with default options

Implemented in Torch by Yannis M. Assael (www.yannisassael.com)