text-rnn allows you to create modern neural network architectures using techniques such as skip-embedding and attention weighting. Train either a bidirectional or a standard LSTM recurrent neural network to generate text from any dataset, and resume training from a pre-trained model at any time.
text-rnn

text-rnn trains on and generates text at the character level. When trained on a GPU it uses the CuDNN implementation of LSTMs, which significantly reduces training time compared with the standard implementation.
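The repository's preprocessing code is not shown in this README, but character-level training generally works by sliding a fixed-length window over the corpus and predicting the character that follows each window. Here is a minimal sketch of that idea; the function name `make_char_windows` and its details are illustrative assumptions, not text-rnn's actual API:

```python
def make_char_windows(text, input_length):
    """Build (window, next-char) training pairs from raw text.

    Characters are mapped to integer ids; each training example is a
    window of `input_length` ids whose target is the id of the next
    character in the corpus.
    """
    # Build a character vocabulary from the corpus.
    vocab = sorted(set(text))
    char_to_id = {ch: i for i, ch in enumerate(vocab)}
    ids = [char_to_id[ch] for ch in text]

    inputs, targets = [], []
    for start in range(len(ids) - input_length):
        inputs.append(ids[start:start + input_length])  # input window
        targets.append(ids[start + input_length])       # next character
    return inputs, targets, vocab
```

Each integer-id window would then pass through the embedding layer and into the LSTM stack, with the model trained to predict the target character.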

You can configure whether to use bidirectional RNNs, the number of RNN layers, RNN size, input length, and size of the embedding layer.
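These options directly control model size. As a rough illustration, assuming the standard LSTM parameterization (four gates, each with input weights, recurrent weights, and a bias) and a dense softmax output over the vocabulary, the parameter count grows like this; the helper below is a sketch, not part of text-rnn:

```python
def count_parameters(vocab_size, embedding_dim, rnn_size, num_layers, bidirectional):
    """Rough parameter count for an embedding + stacked-LSTM + softmax model."""

    def lstm_params(input_dim):
        # Each of the 4 LSTM gates has an input weight matrix, a
        # recurrent weight matrix, and a bias vector.
        return 4 * (rnn_size * input_dim + rnn_size * rnn_size + rnn_size)

    total = vocab_size * embedding_dim      # embedding table
    directions = 2 if bidirectional else 1  # bidirectional doubles each layer
    input_dim = embedding_dim
    for _ in range(num_layers):
        total += directions * lstm_params(input_dim)
        input_dim = directions * rnn_size   # next layer sees the (concatenated) outputs
    total += (input_dim + 1) * vocab_size   # dense softmax over the vocabulary
    return total
```

Note that the recurrent weight term grows with the square of the RNN size, so doubling the RNN size roughly quadruples the recurrent weights; RNN size is therefore the main knob for trading model capacity against training time.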

If you would like to train using a free GPU, check out this Colaboratory notebook.