Dropout-as-a-Bayesian-Approximation-and-Batch-Normalization

Comparative study: Dropout as a Bayesian Approximation and Batch Normalization

We study the importance of regularization in deep learning models, focusing on two techniques: dropout and batch normalization. To date, dropout remains the most popular choice because of its simplicity. Batch normalization, for its part, achieves state-of-the-art performance in computer vision and can eliminate the need for dropout. However, dropout offers insight into the model uncertainty of a deep neural network when it is kept active at test time, and can then be seen as a Bayesian approximation. We first give a general introduction to overfitting and regularization. Then we show how dropout captures model uncertainty and how batch normalization fixes the input distribution of each layer, allowing deep learning models to learn faster (fast regularization). After that, we discuss the results obtained by both methods in different application domains. Finally, we give our intuitions and perspectives.
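To make the test-time use of dropout concrete, here is a minimal Monte Carlo dropout sketch in PyTorch: the network is kept in training mode during inference so dropout stays stochastic, and the spread of repeated forward passes estimates model uncertainty. The architecture, layer sizes, dropout rate, and number of samples (T = 100) are illustrative assumptions, not values taken from the papers.

```python
import torch
import torch.nn as nn

# Hypothetical two-layer regression network; sizes are chosen for illustration.
model = nn.Sequential(
    nn.Linear(10, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # left active at test time for MC dropout
    nn.Linear(64, 1),
)

x = torch.randn(1, 10)  # one dummy input

# model.train() keeps the Dropout layer stochastic during inference.
model.train()
with torch.no_grad():
    samples = torch.stack([model(x) for _ in range(100)])  # T = 100 passes

mean = samples.mean(dim=0)        # predictive mean
uncertainty = samples.std(dim=0)  # predictive std, a proxy for model uncertainty
print(mean.item(), uncertainty.item())
```

With dropout disabled (`model.eval()`), every pass would return the same value and the uncertainty estimate would collapse to zero, which is exactly the information standard test-time inference throws away.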

Original papers:

Dropout as a Bayesian approximation: Representing model uncertainty in deep learning: https://arxiv.org/abs/1506.02142

Batch normalization: Accelerating deep network training by reducing internal covariate shift: https://arxiv.org/abs/1502.03167
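For the second paper, a minimal NumPy sketch of the batch normalizing transform it introduces: each feature is normalized to zero mean and unit variance over the mini-batch, then scaled and shifted by the learned parameters gamma and beta. The toy data and shapes are assumptions for illustration.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch normalization for a mini-batch x of shape (N, D)."""
    mu = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                    # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalize each feature
    return gamma * x_hat + beta            # learned scale and shift

# Toy usage: 4 samples, 3 features; gamma/beta initialized to the identity map.
x = np.random.randn(4, 3)
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0), out.std(axis=0))  # approx. 0 and approx. 1 per feature
```

Because each layer then sees inputs with a stable distribution, higher learning rates become usable, which is the source of the speed-up discussed above.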

Class taught by:

Antoine Cornuéjols: http://www.agroparistech.fr/ufr-info/membres/cornuejols/

Class link: http://www.agroparistech.fr/ufr-info/membres/cornuejols/Teaching/Master-AIC/M2-AIC-advanced-ML.html
