This repository contains the code for a word-level Language Model for text generation, built with a Recurrent Neural Network (RNN) from scratch in Python and NumPy.
The code implements a basic Recurrent Neural Network language model that predicts the next word in a sentence and assembles complete sentences with proper punctuation.
Note: Use this code to understand, from scratch, the underlying concepts of how a language model works using an RNN with Backpropagation Through Time (BPTT). Training it on a typical PC/laptop is not advisable: because the dataset is large, training and text generation could take days or weeks.
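To make the idea concrete, here is a minimal sketch of a single RNN forward step for next-word prediction. This is not the repository's actual code; the parameter names (`U`, `W`, `V`) and all sizes are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, hidden_size = 8, 16

# Assumed parameter names: U maps input->hidden, W hidden->hidden, V hidden->vocab.
U = rng.standard_normal((hidden_size, vocab_size)) * 0.01
W = rng.standard_normal((hidden_size, hidden_size)) * 0.01
V = rng.standard_normal((vocab_size, hidden_size)) * 0.01

def step(word_idx, h_prev):
    """One time step: consume a word index, return the next hidden state
    and a probability distribution over the next word."""
    x = np.zeros(vocab_size)
    x[word_idx] = 1.0                      # one-hot encoding of the input word
    h = np.tanh(U @ x + W @ h_prev)        # recurrent hidden-state update
    logits = V @ h
    probs = np.exp(logits - logits.max())  # numerically stable softmax
    probs /= probs.sum()
    return h, probs

h = np.zeros(hidden_size)
h, probs = step(3, h)
print(probs.sum())  # the output is a valid distribution over the vocabulary
```

The hidden state `h` is carried from word to word, which is what lets the model condition each prediction on the sentence so far.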
- Python 3 or later
- NumPy
```shell
python3 rnn_lm.py
```
The following image shows the code in action on a small sample of 1,000 words from the main dataset. The loss decreases with each epoch, which indicates that the implementation is learning; given enough time and compute power, it can generate text.
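The falling loss comes from minimizing the average cross-entropy of the next-word predictions via BPTT and gradient descent. Below is a minimal, self-contained sketch of that training step on a toy repeating sequence; the parameter names, sizes, learning rate, and toy data are all assumptions, not the repository's actual code.

```python
import numpy as np

rng = np.random.default_rng(1)
vocab_size, hidden_size, lr = 5, 12, 0.1
seq = [0, 1, 2, 3, 4, 0, 1, 2, 3, 4]   # toy "corpus": a repeating pattern

U = rng.standard_normal((hidden_size, vocab_size)) * 0.01
W = rng.standard_normal((hidden_size, hidden_size)) * 0.01
V = rng.standard_normal((vocab_size, hidden_size)) * 0.01

def train_step(inputs, targets):
    """One forward + BPTT backward pass; returns the mean cross-entropy loss."""
    xs, hs, ps = {}, {-1: np.zeros(hidden_size)}, {}
    loss = 0.0
    # forward pass through time
    for t, w in enumerate(inputs):
        x = np.zeros(vocab_size); x[w] = 1.0
        xs[t] = x
        hs[t] = np.tanh(U @ x + W @ hs[t - 1])
        logits = V @ hs[t]
        e = np.exp(logits - logits.max())
        ps[t] = e / e.sum()                      # softmax over the vocabulary
        loss += -np.log(ps[t][targets[t]])       # cross-entropy with the true next word
    # backward pass: Backpropagation Through Time
    dU, dW, dV = np.zeros_like(U), np.zeros_like(W), np.zeros_like(V)
    dh_next = np.zeros(hidden_size)
    for t in reversed(range(len(inputs))):
        dy = ps[t].copy(); dy[targets[t]] -= 1.0  # gradient of softmax + cross-entropy
        dV += np.outer(dy, hs[t])
        dh = V.T @ dy + dh_next                   # gradient flowing back through time
        draw = (1.0 - hs[t] ** 2) * dh            # through the tanh nonlinearity
        dU += np.outer(draw, xs[t])
        dW += np.outer(draw, hs[t - 1])
        dh_next = W.T @ draw
    for param, grad in ((U, dU), (W, dW), (V, dV)):
        param -= lr * np.clip(grad, -5, 5)        # clipped SGD update
    return loss / len(inputs)

losses = [train_step(seq[:-1], seq[1:]) for _ in range(200)]
print(losses[0], losses[-1])  # the loss shrinks as training proceeds
```

On this trivially predictable sequence the loss drops quickly; on a real corpus the same loop is simply far slower, which is why the note above warns against training on a laptop.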
- Some generated text samples
Anyway, to the city scene you’re an idiot teenager.
What ? ! ! ! ! ignore!
Screw fitness, you’re saying: https
Thanks for the advice to keep my thoughts around girls.
Yep, please disappear with the terrible generation.
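Samples like the ones above are produced by repeatedly drawing the next word from the model's softmax distribution and feeding it back in. A minimal sketch of that generation loop follows; the tiny vocabulary and random (untrained) parameters are assumptions purely to illustrate the mechanics.

```python
import numpy as np

rng = np.random.default_rng(2)
vocab = ["the", "cat", "sat", "on", "mat", "."]
vocab_size, hidden_size = len(vocab), 10

# Untrained (random) parameters, just to demonstrate the sampling loop.
U = rng.standard_normal((hidden_size, vocab_size)) * 0.01
W = rng.standard_normal((hidden_size, hidden_size)) * 0.01
V = rng.standard_normal((vocab_size, hidden_size)) * 0.01

def sample(seed_idx, n_words):
    """Generate n_words by repeatedly sampling from the softmax output
    and feeding the sampled word back in as the next input."""
    h = np.zeros(hidden_size)
    idx, out = seed_idx, []
    for _ in range(n_words):
        x = np.zeros(vocab_size); x[idx] = 1.0
        h = np.tanh(U @ x + W @ h)
        logits = V @ h
        p = np.exp(logits - logits.max()); p /= p.sum()
        idx = rng.choice(vocab_size, p=p)   # draw the next word index
        out.append(vocab[idx])
    return " ".join(out)

print(sample(0, 8))
```

With trained weights, the same loop yields sentences like the samples shown above; sampling (rather than always taking the argmax) is what gives the output its variety.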
S.No. | Resource (Papers / Blogs / Videos) | Link |
---|---|---|
1. | Andrej Karpathy's Blog "The Unreasonable Effectiveness of Recurrent Neural Networks" | http://karpathy.github.io/2015/05/21/rnn-effectiveness/ |
2. | Denny Britz's Blog [WILDML] | http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-1-introduction-to-rnns/ |
3. | Hugo Larochelle's Videos on NLP | https://www.youtube.com/watch?v=OzZIOiMVUyM&list=PLSaRkipE5TJhNGgRrpRL180SZ1Mg-2jxT |