
PGD-Implemented-Adversarial-attack-on-CIFAR10

Example code implementing the PGD and FGSM algorithms for adversarial attacks on CIFAR-10.

Pretrained model

The pretrained models are from here.
Please download the pretrained models first and put them in /cifar10_models/state_dicts, as instructed at the above link.

Prepare normal examples

Please prepare your CIFAR-10 normal examples and put each class in its own folder, e.g.:

imgs/
├── frog/
│   ├── frog1.png
│   ├── frog2.png
│   └── ...
├── automobile/
│   └── ...
└── ...
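A short script can produce the layout above. This is a sketch only, assuming the source images carry a class-name prefix (e.g. frog1.png); the repository itself does not require that naming scheme, and `sort_into_class_folders` is a hypothetical helper, not part of this repo:

```python
import os

# The ten CIFAR-10 class names.
CLASSES = ["airplane", "automobile", "bird", "cat", "deer",
           "dog", "frog", "horse", "ship", "truck"]

def sort_into_class_folders(src_dir, dst_dir="imgs"):
    """Move PNGs from a flat folder into per-class subfolders,
    guessing the class from a filename prefix like 'frog1.png'."""
    for name in os.listdir(src_dir):
        for cls in CLASSES:
            if name.startswith(cls):
                os.makedirs(os.path.join(dst_dir, cls), exist_ok=True)
                os.replace(os.path.join(src_dir, name),
                           os.path.join(dst_dir, cls, name))
                break
```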

Generate adversarial examples

$ python3 main.py -I input_normal_examples_path -M model -T mode -O adversarial_examples_folder_name

model: one of vgg16_bn, resnet50, mobilenet_v2, densenet161
mode: PGD or FGSM
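For reference, the core update rules of the two attacks can be sketched as follows. This is not the repository's main.py; it is a minimal NumPy illustration on a toy logistic-regression model, where FGSM takes a single signed-gradient step and PGD iterates smaller steps with projection back onto the epsilon-ball:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def input_grad(x, w, y):
    # Gradient of the binary cross-entropy loss w.r.t. the input x
    # for a logistic model p = sigmoid(w . x) with label y.
    return (sigmoid(w @ x) - y) * w

def fgsm(x, w, y, eps):
    # Single step of size eps in the sign of the input gradient.
    return np.clip(x + eps * np.sign(input_grad(x, w, y)), 0.0, 1.0)

def pgd(x, w, y, eps, alpha, steps):
    # Iterated signed-gradient steps, projected back onto the
    # L-infinity ball of radius eps around the clean input.
    x_adv = x.copy()
    for _ in range(steps):
        x_adv = x_adv + alpha * np.sign(input_grad(x_adv, w, y))
        x_adv = np.clip(x_adv, x - eps, x + eps)  # project to eps-ball
        x_adv = np.clip(x_adv, 0.0, 1.0)          # keep valid pixel range
    return x_adv
```

The same structure carries over to the CNN case, with the analytic gradient replaced by backpropagation through the network.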

Investigate transferability

$ python3 transferability.py -I input_normal_examples_path -O 1or0

-O: whether to generate a confusion table (1) or not (0)
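Transferability measures how often adversarial examples crafted against one model also fool another. The confusion table can be summarized as a source-model by target-model grid of attack success rates; a hedged sketch of that aggregation (the function name and input format are assumptions, not the repository's API):

```python
import numpy as np

def transfer_table(preds, labels):
    """Build a source-model x target-model transfer table.

    preds[src][tgt] holds the target model's predictions on adversarial
    examples crafted against the source model; an attack counts as
    transferred when the prediction differs from the true label.
    """
    models = sorted(preds)
    labels = np.asarray(labels)
    table = np.zeros((len(models), len(models)))
    for i, src in enumerate(models):
        for j, tgt in enumerate(models):
            table[i, j] = np.mean(np.asarray(preds[src][tgt]) != labels)
    return models, table
```

The diagonal gives the white-box success rate; off-diagonal cells show how well the attack transfers across architectures.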
