Dropout is a regularisation technique for neural network training in which unit activations are independently set to zero with a given probability. In this work, we propose a generalisation of dropout and other multiplicative noise injection schemes for shallow and deep neural networks, in which the random noise applied to different units is not independent but follows a joint distribution that is either fixed or estimated during training. We provide theoretical insights into why such adaptive structured noise injection (ASNI) may be relevant, and empirically confirm that it boosts the accuracy of simple neural networks, disentangles the hidden-layer representations, and yields sparser representations.
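The contrast between standard dropout and structured multiplicative noise can be sketched as follows. This is a minimal illustration, not the repository's implementation: the choice of a multivariate Gaussian with mean one and the particular covariance below are assumptions for the sketch (in ASNI the noise structure is fixed or adapted during training).

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_mask(shape, p, rng):
    # Standard (inverted) dropout: each unit is kept independently with
    # probability 1 - p, and kept units are rescaled by 1 / (1 - p).
    return rng.binomial(1, 1 - p, size=shape) / (1 - p)

def structured_noise(batch_size, cov, rng):
    # Structured multiplicative noise: one noise vector per example, drawn
    # from a joint Gaussian with mean 1 so that E[noise * h] = h.
    # The covariance `cov` encodes the dependence between units
    # (hypothetical choice here; ASNI would estimate it during training).
    d = cov.shape[0]
    return rng.multivariate_normal(np.ones(d), cov, size=batch_size)

# A small batch of hidden-layer activations (4 examples, 3 units).
h = rng.standard_normal((4, 3))

# Independent noise: dropout with rate p = 0.5.
h_dropout = h * dropout_mask(h.shape, 0.5, rng)

# Correlated noise: units share half of their noise variance.
d = h.shape[1]
cov = 0.25 * (0.5 * np.eye(d) + 0.5 * np.ones((d, d)))
h_asni = h * structured_noise(h.shape[0], cov, rng)
```

Both schemes leave the activations unbiased in expectation; the difference is only in whether the per-unit noise variables are drawn independently or jointly.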
BeyremKh/ASNI
About
Adaptive Structured Noise Injection for neural networks