A Theano implementation of Hinton's dropout.


Theano implementation of dropout.  See: http://arxiv.org/abs/1207.0580

Run with:

    ./mlp.py dropout

for dropout, or

    ./mlp.py backprop

for regular backprop with no dropout.


Then run:

    ./plot_results.sh results.png

to visualize the results.

Based on code from:
- http://deeplearning.net/tutorial/mlp.html
- http://deeplearning.net/tutorial/logreg.html

To make the results directly comparable to Hinton's paper, use the MNIST batches here:
- http://www.cs.ox.ac.uk/people/misha.denil/hidden/mnist_batches.npz
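As a sketch of how an `.npz` batch file like the one above can be inspected and loaded with NumPy (the key names below are assumptions for illustration, not taken from the repo; check `data.files` on the real `mnist_batches.npz` to see what it actually contains):

```python
import numpy as np

# Create a small stand-in archive so this sketch is self-contained;
# with the real mnist_batches.npz this step is skipped.
# The key names 'images' and 'labels' are hypothetical.
np.savez('batches_demo.npz',
         images=np.zeros((100, 784), dtype=np.float32),
         labels=np.zeros(100, dtype=np.int64))

data = np.load('batches_demo.npz')
print(data.files)  # lists the array names stored in the archive
images = data['images']
labels = data['labels']
print(images.shape, labels.shape)
```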