Contains notebooks for the PAR tutorial at CVPR 2021.
Code for our recently published attack, FDA: Feature Disruptive Attack. Colab notebook: https://colab.research.google.com/drive/1WhkKCrzFq5b7SNrbLUfdLVo5-WK5mLJh
Experimental adversarial-attack notebooks for computer vision models.
Notebooks exploring image manipulation to trick neural networks.
Data generation and model training notebooks for the paper: Architectural Resilience to Foreground-and-Background Adversarial Noise
Notebook implementing different adversarial attack approaches in Python and PyTorch.
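As a reference point for the attack approaches these notebooks cover, here is a minimal sketch of one canonical method, the Fast Gradient Sign Method (FGSM), in PyTorch. The model, inputs, and epsilon below are toy placeholders, not taken from any of the listed repositories.

```python
import torch
import torch.nn as nn

def fgsm_attack(model, x, y, epsilon=0.03):
    """Perturb x by epsilon * sign(grad of loss w.r.t. x) to raise the loss."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x_adv), y)
    loss.backward()
    # Step in the direction that increases the loss; clamp to a valid [0, 1] range.
    return (x_adv + epsilon * x_adv.grad.sign()).clamp(0.0, 1.0).detach()

# Toy usage: a random linear "classifier" on flat 8-dimensional inputs.
torch.manual_seed(0)
model = nn.Linear(8, 3)
x = torch.rand(4, 8)
y = torch.tensor([0, 1, 2, 0])
x_adv = fgsm_attack(model, x, y, epsilon=0.03)
```

Because of the final clamp, each pixel moves by at most epsilon; stronger iterative variants (e.g. PGD) repeat this step with a projection back into the epsilon-ball.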