Notebook implementing different adversarial-attack approaches in Python and PyTorch.
Updated Feb 24, 2024 · Jupyter Notebook
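As a quick illustration of the kind of attack these notebooks implement, here is a minimal NumPy sketch of the Fast Gradient Sign Method (FGSM) on a toy logistic-regression "model". The weight vector, input, and epsilon are made-up illustrations, not code from any of the listed repositories.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm_attack(x, y, w, eps):
    """FGSM on logistic regression with label y in {-1, +1}.

    Loss: L = -log sigmoid(y * w.x)
    Gradient w.r.t. the input: dL/dx = -y * sigmoid(-y * w.x) * w
    The attack steps the input by eps in the sign of that gradient.
    """
    grad = -y * sigmoid(-y * np.dot(w, x)) * w
    return x + eps * np.sign(grad)

# Toy model: the weight vector defines the decision boundary.
w = np.array([1.0, -2.0, 0.5])
x = np.array([0.5, -0.5, 1.0])  # correctly classified as +1
y = 1.0

conf_clean = sigmoid(y * np.dot(w, x))
x_adv = fgsm_attack(x, y, w, eps=0.5)
conf_adv = sigmoid(y * np.dot(w, x_adv))
# The perturbed input lowers the model's confidence in the true label
# while staying within an L-infinity ball of radius eps around x.
```

The same one-step recipe carries over to deep networks: replace the closed-form gradient with autograd (e.g. `x.grad` in PyTorch) and keep the `eps * sign(grad)` step.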
Notebooks exploring image manipulation to trick neural networks.
Contains notebooks for the PAR tutorial at CVPR 2021.
Experimental adversarial-attack notebooks for computer-vision models.
Data generation and model training notebooks for the paper: Architectural Resilience to Foreground-and-Background Adversarial Noise
Code for our published attack FDA: Feature Disruptive Attack. Colab notebook: https://colab.research.google.com/drive/1WhkKCrzFq5b7SNrbLUfdLVo5-WK5mLJh
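Feature-level attacks like FDA perturb a network's intermediate representations rather than its output loss. The sketch below shows that general idea only: gradient ascent on the L2 distance between clean and perturbed hidden features of a toy one-layer ReLU network, under an L-infinity budget. It is not the paper's exact objective, and all weights and hyperparameters are illustrative assumptions.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def feature_attack(x, W, eps=0.3, lr=0.1, steps=10, rng=None):
    """Perturb x to push its hidden features relu(W @ x) away from
    the clean features, keeping |x_adv - x| <= eps elementwise.
    (Generic feature-space attack, not the exact FDA objective.)"""
    if rng is None:
        rng = np.random.default_rng(0)
    h_clean = relu(W @ x)
    # Random start inside the eps-ball so the first gradient is non-zero.
    x_adv = x + rng.uniform(-eps, eps, size=x.shape)
    for _ in range(steps):
        z = W @ x_adv
        h = relu(z)
        # Gradient of 0.5 * ||h - h_clean||^2 w.r.t. x_adv,
        # with (z > 0) acting as the ReLU derivative mask.
        grad = W.T @ ((h - h_clean) * (z > 0))
        x_adv = np.clip(x_adv + lr * np.sign(grad), x - eps, x + eps)
    return x_adv

# Toy one-layer network and input (illustrative values only).
W = np.array([[1.0, -0.5, 0.2],
              [0.3, 0.8, -1.0],
              [-0.7, 0.1, 0.4]])
x = np.array([0.2, -0.1, 0.5])

x_adv = feature_attack(x, W)
dist = np.linalg.norm(relu(W @ x_adv) - relu(W @ x))
# dist > 0: the hidden features of x_adv have been driven away
# from the clean features within the perturbation budget.
```

In a real model the hidden layer would be a chosen intermediate activation of a deep network, with the gradient supplied by autograd instead of the hand-derived expression above.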