The official implementation of the Gaussian moment regularizer.

Gaussian Regularizer

Despite the impressive performance of deep neural networks (DNNs) on numerous vision tasks, they still exhibit behaviours that are not yet well understood. One puzzling behaviour is their acute sensitivity to various input noise attacks. This nuisance has strengthened the line of research on developing and training noise-robust networks. In this work, we propose a new training regularizer that aims to minimize the probabilistic expected training loss of a DNN subject to a generic Gaussian input. We provide a simple and efficient approach to approximating this regularizer for arbitrary deep networks by leveraging the analytic expression for the output mean of a shallow neural network, thus avoiding memory- and computation-expensive data augmentation. We conduct extensive experiments with LeNet and AlexNet on several datasets, including MNIST, CIFAR10, and CIFAR100, demonstrating the effectiveness of the proposed regularizer. In particular, we show that networks trained with the proposed regularizer gain a robustness boost equivalent to performing 3-21 folds of data augmentation.
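The key ingredient mentioned above is the closed-form output mean of a shallow (one-hidden-layer) ReLU network under Gaussian input: since the pre-activation z = Wx + b of a Gaussian x is itself Gaussian, E[ReLU(z)] has the well-known analytic form m·Φ(m/s) + s·φ(m/s), where Φ and φ are the standard normal CDF and PDF. The sketch below is only an illustration of that identity (it is not the repository's actual regularizer code, and all network sizes and names are made up); it checks the closed form against a Monte-Carlo estimate obtained by noise augmentation, which is exactly the expensive alternative the paper avoids.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical tiny shallow network: y = ReLU(W x + b)
d_in, d_hid = 5, 3
W = rng.standard_normal((d_hid, d_in))
b = rng.standard_normal(d_hid)

mu = rng.standard_normal(d_in)  # clean input (mean of the Gaussian)
sigma = 0.5                     # input noise standard deviation

# Pre-activation z = W x + b with x ~ N(mu, sigma^2 I) is Gaussian:
m = W @ mu + b                         # per-unit mean
s = sigma * np.linalg.norm(W, axis=1)  # per-unit standard deviation

# Closed-form mean of a rectified Gaussian:
#   E[max(z, 0)] = m * Phi(m/s) + s * phi(m/s)
analytic_mean = m * norm.cdf(m / s) + s * norm.pdf(m / s)

# Monte-Carlo estimate via Gaussian data augmentation (the costly route)
x = mu + sigma * rng.standard_normal((200_000, d_in))
mc_mean = np.maximum(x @ W.T + b, 0.0).mean(axis=0)

print(np.max(np.abs(analytic_mean - mc_mean)))  # close to zero
```

The analytic expression needs a single forward-style computation, whereas the Monte-Carlo estimate needs hundreds of thousands of noisy copies of the input to reach comparable accuracy, which is the memory/compute saving the abstract refers to.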


All the requirements are listed in requirements.txt.

pip install -r requirements.txt
python --help

The training logs for all experiments reported in the paper are published under Releases as a single zip file.


This is the official implementation of the method described in this paper:

@article{alfadly2019analytical,
  author = {Alfadly, Modar and Bibi, Adel and Ghanem, Bernard},
  title  = {Analytical Moment Regularizer for Gaussian Robust Networks},
  month  = {April},
  year   = {2019}
}




Modar M. Alfadly


I would gladly accept any pull request that improves any aspect of this repository.
