
Adaptive Structured Noise Injection for neural networks


BeyremKh/ASNI


ASNI

Dropout is a regularisation technique for neural network training in which unit activations are independently set to zero with a given probability. In this work, we propose a generalisation of dropout and other multiplicative noise injection schemes for shallow and deep neural networks, where the random noise applied to different units is not independent but follows a joint distribution that is either fixed or estimated during training. We provide theoretical insights into why such adaptive structured noise injection (ASNI) may be relevant, and empirically confirm that it boosts the accuracy of simple neural networks, disentangles the hidden-layer representations, and leads to sparser representations.
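To make the idea concrete, here is a minimal NumPy sketch contrasting independent dropout noise with correlated multiplicative noise drawn from a joint distribution. This is an illustration of the general principle only, not the paper's method: the covariance matrix below is hand-picked for the example, whereas ASNI either fixes it or estimates it during training.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_mask(n_units, p=0.5):
    """Classic dropout: independent Bernoulli noise per unit,
    scaled by 1/(1-p) so the mask has mean 1 (inverted dropout)."""
    return rng.binomial(1, 1 - p, size=n_units) / (1 - p)

def structured_noise(cov):
    """Correlated multiplicative noise with mean 1 and a given
    joint covariance. Illustrative stand-in for structured noise
    injection; the covariance here is a hypothetical example."""
    n_units = cov.shape[0]
    return rng.multivariate_normal(mean=np.ones(n_units), cov=cov)

# Toy hidden-layer activations (batch of 4 examples, 3 units).
h = np.array([[1.0, 2.0, 3.0],
              [0.5, 1.5, 2.5],
              [2.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])

# Independent multiplicative noise (dropout).
h_dropout = h * dropout_mask(3)

# Structured noise: units 0 and 1 receive strongly correlated noise,
# unit 2 is perturbed independently.
cov = np.array([[0.25, 0.225, 0.0],
                [0.225, 0.25, 0.0],
                [0.0,   0.0,  0.25]])
xi = structured_noise(cov)
h_asni = h * xi  # the same noise vector is broadcast across the batch
```

Both schemes multiply activations by random noise with mean one; the only difference is the joint distribution of that noise across units, which is exactly the degree of freedom ASNI exploits.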

Results

MNIST 32 activations

MNIST 64 activations

MNIST 256 activations

MNIST 512 activations

MNIST 1024 activations

MNIST fast

CIFAR-10

CIFAR-10 fast

CIFAR-100

CIFAR-100 fast

MNIST varying

Simulation
