svmakarychev/Adaptive-dropout-with-Rademacher-complexity

ADAPTIVE DROPOUT WITH RADEMACHER COMPLEXITY REGULARIZATION

This is the project for the BMML (Bayesian Methods in Machine Learning) course at Skoltech.

In this project we analyze adaptive dropout with Rademacher complexity regularization. Our main reference is https://openreview.net/pdf?id=S1uxsye0Z, whose authors propose a novel framework that adaptively adjusts the dropout rates of a deep neural network based on a Rademacher complexity bound.
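To make the idea concrete, here is a minimal NumPy sketch of training with such a penalty. It is an illustrative assumption, not the paper's exact bound: we use a surrogate complexity term equal to the product, over layers, of the keep probability times the Frobenius norm of the layer's weights, so that lowering the keep probabilities shrinks the penalty. The function name `rademacher_penalty` and the form of the surrogate are ours, chosen for illustration.

```python
import numpy as np

def rademacher_penalty(weights, keep_probs):
    # Surrogate for a Rademacher complexity bound (an assumption for
    # illustration): product over layers of the keep probability times
    # the Frobenius norm of that layer's weight matrix.
    bound = 1.0
    for W, p in zip(weights, keep_probs):
        bound *= p * np.linalg.norm(W)
    return bound

# Toy two-layer network: weights and per-layer keep probabilities.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 8)), rng.standard_normal((8, 2))]
keep_probs = [0.8, 0.5]

# The penalty would be added to the task loss; gradient steps on the
# keep probabilities then trade accuracy against network complexity.
penalty = rademacher_penalty(weights, keep_probs)
```

In the full method this term is minimized jointly with the data loss, so the dropout rates adapt during training rather than being fixed hyperparameters.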

Our plan is the following:

  1. Understand the model
  2. Reproduce the experiments on the MNIST dataset
  3. Reproduce the experiments on the CIFAR dataset
  4. Compare the results with the approach discussed in the article Variational Dropout Sparsifies Deep Neural Networks (https://arxiv.org/pdf/1701.05369.pdf)

Our team: Makarychev Sergey, Rozhnov Alexander.
