Run fully connected artificial neural networks with dropout applied (mini)batchwise, rather than samplewise. Given two hidden layers each subject to 50% dropout, the corresponding matrix multiplications for forward- and back-propagation require 75% less work, since the dropped-out units are never calculated.
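
To make the saving concrete: when one dropout mask is shared across the whole minibatch, the surviving units of consecutive layers select a submatrix of the weight matrix, and only that submatrix enters the matrix multiplication. Below is a minimal NumPy sketch of this idea; it is illustrative only, not the repository's implementation, and all names and sizes are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
batch, n_in, n_out = 32, 1024, 1024  # illustrative sizes
p = 0.5                              # dropout probability

x = rng.standard_normal((batch, n_in))
W = rng.standard_normal((n_in, n_out))

# Samplewise dropout: each sample has its own mask, so the full
# product with W must be computed even though half the inputs are zero.
sample_mask = rng.random((batch, n_in)) > p
h_samplewise = (x * sample_mask) @ W

# Batchwise dropout: one mask shared by the whole minibatch, so the
# dropped rows and columns of W can be sliced away before multiplying.
keep_in = rng.random(n_in) > p    # ~50% of input units survive
keep_out = rng.random(n_out) > p  # ~50% of output units survive
h_batchwise = x[:, keep_in] @ W[np.ix_(keep_in, keep_out)]
# The sliced product touches roughly 0.5 * 0.5 = 25% of W's entries,
# i.e. 75% fewer multiply-adds, matching the claim above.
```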

Batchwise Dropout
Benjamin Graham, University of Warwick, 2015
GPLv3

If you use this software, please tell me what you are using it for (b.graham@warwick.ac.uk).

Run "make dataset" for dataset in the list { mnist, cifar10, artificial }
