
Implement Robust DPatch Attack #149

Open
deprit opened this issue Apr 25, 2024 · 1 comment · Fixed by #155
deprit commented Apr 25, 2024

Armory Library must support custom evasion attacks designed by the user; supporting user-developed attacks requires the steps below.

  • Design and implement a custom attack API that
    • Optimizes an attack over a training split and, possibly, a validation split;
    • Exposes the optimization loop to PyTorch Lightning;
    • Logs optimization metrics and attack artifacts to MLflow.
  • Demonstrate a custom attack that
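The optimization loop the API steps above describe could be sketched roughly as follows. This is a minimal, stdlib-only illustration, not armory-library code: the class name `RobustPatchAttack`, the scalar "patch", and the `log_fn` callback (where an MLflow client would slot in) are all hypothetical stand-ins.

```python
class RobustPatchAttack:
    """Hypothetical sketch of a custom evasion-attack optimizer.

    Mirrors the API steps above: optimize over a training split,
    optionally evaluate on a validation split, and log per-epoch
    metrics through a pluggable logger callback.
    """

    def __init__(self, step_size=0.1, epochs=3, log_fn=print):
        self.patch = 0.0          # stand-in for a patch tensor
        self.step_size = step_size
        self.epochs = epochs
        self.log_fn = log_fn      # e.g. mlflow.log_metrics would go here

    def _loss(self, sample):
        # Toy surrogate loss: squared distance of the patch from a
        # per-sample target value.
        return (self.patch - sample) ** 2

    def fit(self, train_split, val_split=None):
        for epoch in range(self.epochs):
            for sample in train_split:
                # Analytic gradient of (patch - sample)^2 w.r.t. patch.
                grad = 2 * (self.patch - sample)
                self.patch -= self.step_size * grad
            metrics = {
                "epoch": epoch,
                "train_loss": sum(self._loss(s) for s in train_split),
            }
            if val_split is not None:
                metrics["val_loss"] = sum(self._loss(s) for s in val_split)
            self.log_fn(metrics)
        return self.patch
```

With a training split of `[1.0]` and a step size of 0.25, the patch value halves its distance to the target each step and converges toward 1.0; in the real attack the scalar would be a patch tensor and the gradient would come from autograd inside a Lightning `training_step`.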
@deprit deprit added the phoenix label Apr 25, 2024
@treubig26 treubig26 linked a pull request May 21, 2024 that will close this issue
@treubig26 (Collaborator) commented:
This has been partially addressed by PR #155; however, more work remains to generalize the attack optimization and integrate it into the armory-library API.

Specifically, a new attack optimization engine should be created to handle instantiation of the Lightning module (with more hooks and parameterization) and the trainer.
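The engine proposed above might take a shape like the following. This is a speculative, stdlib-only sketch of the division of responsibilities, not the actual armory-library design: `AttackModule`, `AttackOptimizationEngine`, and the hook names are all hypothetical.

```python
class AttackModule:
    """Hypothetical Lightning-style module wrapping an attack's loss.

    The engine, not the user, is responsible for constructing this,
    threading user-supplied hooks into the training step.
    """

    def __init__(self, attack, on_step_end=None):
        self.attack = attack
        self.on_step_end = on_step_end or (lambda module, loss: None)

    def training_step(self, batch):
        loss = self.attack.loss(batch)
        self.on_step_end(self, loss)  # user hook, mirroring Lightning callbacks
        return loss


class AttackOptimizationEngine:
    """Sketch of the proposed engine: builds the module and a minimal
    trainer loop from user-supplied parameterization, then runs the fit."""

    def __init__(self, attack_factory, max_epochs=1, hooks=None):
        self.attack_factory = attack_factory
        self.max_epochs = max_epochs
        self.hooks = hooks or {}

    def run(self, batches):
        module = AttackModule(
            self.attack_factory(),
            on_step_end=self.hooks.get("on_step_end"),
        )
        losses = []
        for _ in range(self.max_epochs):  # stand-in for the Lightning Trainer
            for batch in batches:
                losses.append(module.training_step(batch))
        return losses
```

The design point this illustrates is that the user supplies only an attack factory and optional hooks; instantiation of the module and the trainer stays inside the engine, which is what would let armory-library swap in a real `pytorch_lightning.Trainer` without changing user code.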

The following comments from the PR should also be addressed in a follow-on PR:

@treubig26 treubig26 reopened this Jun 11, 2024