pnoiseattack (paper code)

  • We propose a novel global backdoor trigger, called the Procedural Noise Trigger, which is generated from procedural noise.

In recent years, researchers have paid increasing attention to the security of artificial intelligence. The backdoor attack is one such threat, with a powerful and stealthy attack capability. A growing trend is for triggers to become dynamic and global. In this paper, we propose a novel global backdoor trigger generated by procedural noise. Compared with most triggers, ours are stealthier and more straightforward to implement. There are three types of procedural noise, and we evaluate the attack ability of triggers built from each on different classification datasets, including CIFAR-10, GTSRB, CelebA, and ImageNet12. The experimental results show that our attack approach can bypass most defense approaches, and even human inspection. We only need to poison 5%-10% of the training data, and the attack success rate (ASR) can reach over 99%. To test the robustness of the backdoored model against corruption methods encountered in practice, we introduce 17 corruption methods and compute the accuracy and attack success rate (ASR) of the backdoored model. The results show that backdoored models generated by our approach are robust to most corruption methods and can be applied in reality. Our code is available at https://github.com/928082786/pnoiseattack.
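To illustrate the idea, here is a minimal sketch of poisoning an image with a procedural-noise trigger. It uses a simple 2D Perlin-style gradient noise (one common type of procedural noise); the function names, the lattice `scale`, and the blending weight `alpha` are illustrative assumptions, not the paper's actual implementation or parameters.

```python
import numpy as np

def perlin_noise(h, w, scale=8, seed=0):
    """Generate a 2D Perlin-style gradient noise map, normalized to [0, 1].

    Illustrative sketch: random unit gradients on a coarse lattice,
    interpolated with the classic smoothstep fade curve.
    """
    rng = np.random.default_rng(seed)
    # Random unit gradient vectors at each lattice corner.
    angles = rng.uniform(0, 2 * np.pi, (scale + 1, scale + 1))
    grad = np.stack([np.cos(angles), np.sin(angles)], axis=-1)

    ys = np.linspace(0, scale, h, endpoint=False)
    xs = np.linspace(0, scale, w, endpoint=False)
    gy, gx = np.meshgrid(ys, xs, indexing="ij")
    y0, x0 = gy.astype(int), gx.astype(int)
    fy, fx = gy - y0, gx - x0  # fractional position inside each cell

    def corner_dot(iy, ix, dy, dx):
        # Dot product of the corner gradient with the offset vector.
        g = grad[iy, ix]
        return g[..., 0] * dy + g[..., 1] * dx

    n00 = corner_dot(y0, x0, fy, fx)
    n01 = corner_dot(y0, x0 + 1, fy, fx - 1)
    n10 = corner_dot(y0 + 1, x0, fy - 1, fx)
    n11 = corner_dot(y0 + 1, x0 + 1, fy - 1, fx - 1)

    # Smoothstep (fade) interpolation: 6t^5 - 15t^4 + 10t^3.
    u = fx * fx * fx * (fx * (fx * 6 - 15) + 10)
    v = fy * fy * fy * (fy * (fy * 6 - 15) + 10)
    nx0 = n00 * (1 - u) + n01 * u
    nx1 = n10 * (1 - u) + n11 * u
    noise = nx0 * (1 - v) + nx1 * v
    return (noise - noise.min()) / (noise.max() - noise.min() + 1e-8)

def poison_image(img, alpha=0.1, seed=0):
    """Blend a procedural-noise trigger into an image (pixel values in [0, 1]).

    `alpha` is a hypothetical blending weight; the same seed yields the same
    global trigger, which would be applied to the chosen fraction of the
    training set (e.g. 5%-10%) along with the attacker's target label.
    """
    trigger = perlin_noise(img.shape[0], img.shape[1], seed=seed)
    if img.ndim == 3:
        trigger = trigger[..., None]  # broadcast over color channels
    return np.clip((1 - alpha) * img + alpha * trigger, 0.0, 1.0)
```

Because the trigger is a low-frequency pattern spanning the whole image rather than a small patch, the poisoned sample stays visually close to the original, which is what makes this style of trigger hard to spot by inspection.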

Poster

Poster

The demo of poisoned data on different datasets

Demo
