
Elastic attack bounds on CIFAR-10 seem to be too large #3

Status: Open
cassidylaidlaw opened this issue Apr 21, 2020 · 1 comment
@cassidylaidlaw (Contributor)

Here is a set of images generated by your elastic attack on a random sample of CIFAR-10 images against a robust model at eps4 (1): [images attached in the original issue]

I have no idea what most of these images are. In the cases where an image is still recognizable, it has been moved into a different class; for instance, the three "frogs" along the bottom center were originally two dogs and a horse. It seems unreasonable to evaluate against such an attack, and you also include two attacks with even larger bounds (eps5 and eps6).
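For context, an elastic attack warps the image with a smooth per-pixel displacement (flow) field whose magnitude is bounded by the attack's epsilon. The sketch below is a minimal illustration of why a large displacement bound can destroy image content; it uses a random smoothed flow rather than the repo's optimized attack, and all function and parameter names here are illustrative assumptions, not the project's API:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def elastic_deform(img, eps, sigma=3.0, seed=None):
    """Warp a (H, W) image with a random smooth flow field.

    eps bounds the maximum per-pixel displacement (in pixels);
    sigma controls how smooth the flow is. Illustrative only:
    a real elastic attack optimizes the flow adversarially.
    """
    rng = np.random.default_rng(seed)
    h, w = img.shape
    # Random flow, smoothed so that nearby pixels move together.
    flow = rng.uniform(-1.0, 1.0, size=(2, h, w))
    flow = gaussian_filter(flow, sigma=(0, sigma, sigma))
    # Rescale so the largest displacement equals eps pixels.
    flow *= eps / (np.abs(flow).max() + 1e-12)
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    coords = np.stack([ys + flow[0], xs + flow[1]])
    return map_coordinates(img, coords, order=1, mode="reflect")

img = np.linspace(0.0, 1.0, 32 * 32).reshape(32, 32)  # toy 32x32 "image"
small = elastic_deform(img, eps=1.0, seed=0)
large = elastic_deform(img, eps=8.0, seed=0)
# The same flow pattern at a larger bound changes pixels far more.
print(np.abs(small - img).mean() < np.abs(large - img).mean())
```

With a displacement bound of only a few pixels on a 32x32 CIFAR-10 image, large regions of the image can be moved relative to its size, which is consistent with the unrecognizable examples described above.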

Do you think your methodology is reasonable here? I was hoping to use your UAR score for evaluation in a project I'm working on, but the bounds for the elastic attack seem too large. The other attacks' bounds seem more reasonable.

@yi-sun (Collaborator) commented Jun 16, 2020

Apologies for the delayed response, and thanks for pointing this out. For ImageNet, we use targeted attacks to account for the fact that some classes are extremely similar, but as you point out this is less suitable for CIFAR-10. We are looking into how to correct this (subject to compute availability).
