Demystifying Poisoning Backdoor Attacks from a Statistical Perspective

Introduction

We develop a theoretical understanding of the factors that make poisoning backdoor attacks effective. This repository contains the code for both simulated and real-world data experiments that demonstrate the developed theory.

Files

  • src/experiment.py: Runs the experiments on synthetic two-dimensional Gaussian datasets.
  • src/diffusion_backdoor.ipynb: Runs backdoor attacks on diffusion models trained on MNIST.

Both scripts were tested in an environment with PyTorch 2.0.1, CUDA 11.8, and Python 3.10.
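To illustrate the kind of data poisoning studied here, below is a minimal sketch of constructing a backdoored two-dimensional Gaussian dataset: a fraction of samples is shifted by a fixed trigger pattern and relabeled to the attacker's target class. The function name, trigger, and parameter choices are illustrative and not taken from src/experiment.py.

```python
import numpy as np

def make_poisoned_gaussian_dataset(n=1000, poison_frac=0.1,
                                   trigger=(3.0, 3.0), target_label=1,
                                   seed=0):
    """Two-class 2D Gaussian data with a poisoned subset.

    A random fraction of points is shifted by a fixed trigger vector
    and relabeled to the attacker's target class.
    """
    rng = np.random.default_rng(seed)
    # Clean data: two Gaussian classes centered at (-1, -1) and (+1, +1).
    x0 = rng.normal(loc=-1.0, scale=1.0, size=(n // 2, 2))
    x1 = rng.normal(loc=+1.0, scale=1.0, size=(n // 2, 2))
    x = np.vstack([x0, x1])
    y = np.array([0] * (n // 2) + [1] * (n // 2))
    # Poison: add the trigger to a random subset and flip its labels.
    n_poison = int(poison_frac * n)
    idx = rng.choice(n, size=n_poison, replace=False)
    x[idx] += np.asarray(trigger)
    y[idx] = target_label
    return x, y, idx

x, y, poisoned_idx = make_poisoned_gaussian_dataset()
```

A model trained on `(x, y)` can then be evaluated on triggered test points to measure attack success, which is the quantity the theory characterizes in terms of the trigger magnitude and poisoning ratio.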

Reference

Ganghua Wang, Xun Xian, Jayanth Srinivasa, Ashish Kundu, Xuan Bi, Mingyi Hong, and Jie Ding. "Demystifying Poisoning Backdoor Attacks from a Statistical Perspective," International Conference on Learning Representations (ICLR), 2024.

@article{wang2024demystify,
  title={Demystifying Poisoning Backdoor Attacks from a Statistical Perspective},
  author={Wang, Ganghua and Xian, Xun and Srinivasa, Jayanth and Kundu, Ashish and Bi, Xuan and Hong, Mingyi and Ding, Jie},
  journal={Proc. ICLR},
  year={2024}
}

Contact

If you have any questions, please feel free to contact us or submit an issue.
