Robust evasion attacks against neural networks to find adversarial examples
About

Corresponding code to the paper "Towards Evaluating the Robustness of Neural Networks" by Nicholas Carlini and David Wagner, presented at the IEEE Symposium on Security & Privacy, 2017.

Implementations of the three attack algorithms (L0, L2, and L∞) in TensorFlow. The code runs on Python 3 (and probably on Python 2 without many changes).

To evaluate the robustness of a neural network, create a model class with a predict method that runs the network up to, but not including, the softmax layer (i.e., it returns the pre-softmax logits). The model should have the following variables (a minimal sketch follows the list):

model.image_size: size of the image (e.g., 28 for MNIST, 32 for CIFAR)
model.num_channels: 1 for greyscale, 3 for color images
model.num_labels: total number of valid labels (e.g., 10 for MNIST/CIFAR)
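
For reference, a wrapper satisfying this interface might look like the sketch below. This is illustrative only: the architecture and the use of tf.keras are assumptions, not how the repository's own models (setup_mnist.py and friends) are built.

    import tensorflow as tf

    class MyMNISTModel:
        image_size = 28     # MNIST images are 28x28 pixels
        num_channels = 1    # greyscale
        num_labels = 10     # digits 0-9

        def __init__(self):
            # Any network works, provided predict() returns the
            # pre-softmax logits rather than probabilities.
            self.net = tf.keras.Sequential([
                tf.keras.layers.Flatten(input_shape=(28, 28, 1)),
                tf.keras.layers.Dense(128, activation="relu"),
                tf.keras.layers.Dense(10),  # note: no softmax activation
            ])

        def predict(self, xs):
            # xs is a (batch, height, width, channels) tensor of images;
            # returns a (batch, num_labels) tensor of logits.
            return self.net(xs)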

Running attacks

     from l2_attack import CarliniL2
     CarliniL2(sess, model).attack(inputs, targets)

where inputs is a (batch x height x width x channels) tensor and targets is a (batch x classes) one-hot tensor. The L2 attack supports a batch_size parameter to run attacks in parallel. Each attack has many tunable hyper-parameters; all are intuitive, in that pushing a parameter in one direction strictly increases attack efficacy while the other direction makes the attack more efficient.
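
For example, a minimal end-to-end run against the MNIST model might look like the sketch below, assuming the MNIST and MNISTModel classes from setup_mnist.py and a model produced by train_models.py; the hyper-parameter values shown are illustrative.

    import numpy as np
    import tensorflow as tf
    from setup_mnist import MNIST, MNISTModel
    from l2_attack import CarliniL2

    with tf.Session() as sess:
        data = MNIST()                            # loads the MNIST test set
        model = MNISTModel("models/mnist", sess)  # trained by train_models.py

        # Attack the first test image, targeting the next class over.
        inputs = data.test_data[:1]
        target_class = (np.argmax(data.test_labels[0]) + 1) % 10
        targets = np.zeros((1, 10), dtype=np.float32)
        targets[0, target_class] = 1

        attack = CarliniL2(sess, model, batch_size=1, max_iterations=1000)
        adv = attack.attack(inputs, targets)

        print("L2 distortion:", np.sqrt(np.sum((adv - inputs) ** 2)))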

Pre-requisites

The following steps should be sufficient to get these attacks up and running on most Linux-based systems.

    sudo apt-get install python3-pip
    sudo pip3 install --upgrade pip
    sudo pip3 install pillow scipy numpy tensorflow-gpu keras h5py

To create the MNIST/CIFAR models:

python3 train_models.py

To download the inception model:

python3 setup_inception.py

And finally, to test the attacks:

python3 test_attack.py
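
To check an attack's output by hand, the model's prediction on the adversarial examples can be compared against the targets; a sketch, assuming the sess, model, adv, and targets variables from the running-attacks example above are still live:

    import numpy as np
    import tensorflow as tf

    # Classify the adversarial examples and check whether each one
    # reached its target class.
    x = tf.placeholder(tf.float32, (None, model.image_size,
                                    model.image_size, model.num_channels))
    logits = sess.run(model.predict(x), {x: adv})
    print("targets hit:",
          np.argmax(logits, axis=1) == np.argmax(targets, axis=1))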

This code is provided under the BSD 2-Clause license, Copyright 2016 Nicholas Carlini.
