Pytorch 0.4.0 compatibility and binary search option for (Iterative)Gradient(Sign)Attack #147
Conversation
Thanks so much @cjsg! Would you mind adding a simple test for the binary search for both gradient and gradientsign? Then the coverage should recover. @jonasrauber: should we keep pytorch 0.3 compatibility? I am rather inclined to just demand the latest pytorch version; I am not sure why anyone would stay with v0.3.
Thanks @cjsg, looking into it now! @wielandbrendel In principle I would be fine with dropping support for older PyTorch versions because those users could still use old Foolbox versions, though it could make things more difficult when evaluating old PyTorch models because one would need to downgrade Foolbox. So I guess my question for @cjsg and other PyTorch users is: has the PyTorch community already mostly switched to PyTorch 0.4.0, and are new models released with 0.4.0 compatibility? In any case, the …
I like it a lot and only have some very minor requests for changes; then I am happy to merge it. See my comments in the code, plus a Travis upgrade and 100% code coverage (with no-coverage marks only where absolutely necessary).
In the future, however, I think we need to clean up, unify, and simplify the code of the attacks and reduce code duplication; not just for these attacks but in general.
foolbox/models/pytorch.py
Outdated
images = Variable(images, volatile=True)
predictions = self._model(images)
predictions = predictions.data
if torch.__version__[:3] < '0.4':
This version number comparison could break in the future: it compares strings lexicographically, so e.g. version 0.10.0 would compare as older than '0.4' (given the announcement of v1.0, this will probably never happen, but still…)
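A check that avoids this pitfall compares integer tuples instead of strings. The sketch below is illustrative, not Foolbox's code; only `torch.__version__` is taken from the diff above, and the hypothetical parser simply takes the numeric prefix of each dot-separated component:

```python
def version_tuple(version):
    """Parse the leading numeric components of a dotted version string
    into a tuple of ints, e.g. '0.10.0' -> (0, 10, 0).

    String comparison is lexicographic, so '0.1' < '0.4' even when the
    '0.1' came from slicing '0.10.0'; integer tuples compare correctly.
    """
    parts = []
    for component in version.split('.'):
        digits = ''
        for ch in component:
            if not ch.isdigit():
                break
            digits += ch
        if not digits:  # stop at suffixes like 'a0' with no numeric prefix
            break
        parts.append(int(digits))
    return tuple(parts)

# The failure mode the review comment warns about:
assert '0.10.0'[:3] < '0.4'                              # string slice: wrong
assert version_tuple('0.10.0') > version_tuple('0.4.0')  # tuples: correct
```

With something like this, the guard could read `if version_tuple(torch.__version__) < (0, 4):` instead of slicing the string.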
foolbox/attacks/gradient.py
Outdated
@@ -17,7 +18,7 @@ class GradientAttack(Attack):

     @call_decorator
     def __call__(self, input_or_adv, label=None, unpack=True,
-                 epsilons=1000, max_epsilon=1):
+                 epsilons=1000, max_epsilon=1, bin_search=False):
can we rename this to binary_search?
not just here, of course
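For context, the option being renamed controls a binary search that refines the smallest perturbation size that still fools the model. A minimal sketch of such a search, with hypothetical names and assuming `is_adversarial(eps)` is monotone in `eps` (this is not Foolbox's implementation):

```python
def binary_search_epsilon(is_adversarial, lo=0.0, hi=1.0, steps=20):
    """Narrow [lo, hi] around the smallest epsilon whose perturbation is
    still adversarial, assuming larger epsilons remain adversarial.

    Returns the upper bound, i.e. the smallest adversarial epsilon found.
    """
    if not is_adversarial(hi):
        raise ValueError("upper bound must already be adversarial")
    for _ in range(steps):
        mid = (lo + hi) / 2.0
        if is_adversarial(mid):
            hi = mid  # mid still adversarial: search smaller epsilons
        else:
            lo = mid  # mid too small: search larger epsilons
    return hi

# Toy predicate: pretend the attack succeeds once epsilon reaches 0.3
eps = binary_search_epsilon(lambda e: e >= 0.3)
assert 0.3 <= eps < 0.3 + 1e-5
```

After 20 halvings the bracket is about 1e-6 wide, which is why a handful of extra model queries can sharpen the coarse epsilon grid considerably.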
foolbox/attacks/gradient.py
Outdated
_, is_adversarial = a.predictions(perturbed)

iterator.is_adversarial = is_adversarial
using a generator and its send method might be more pythonic, but I think it's fine ;-)
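The pattern alluded to here — a generator that yields candidate epsilons and receives the `is_adversarial` result back through `send` — could look roughly like this (a generic sketch with hypothetical names, not Foolbox's code):

```python
def epsilon_candidates(max_epsilon=1.0, steps=10):
    """Yield increasing epsilons; the caller reports via .send() whether the
    last candidate produced an adversarial, and the generator stops after
    the first success."""
    for i in range(1, steps + 1):
        was_adversarial = yield max_epsilon * i / steps
        if was_adversarial:
            return

# Driving loop: pretend the attack succeeds once epsilon reaches 0.35
gen = epsilon_candidates()
tried = [next(gen)]  # prime the generator (equivalent to gen.send(None))
try:
    while True:
        tried.append(gen.send(tried[-1] >= 0.35))
except StopIteration:
    pass
assert tried == [0.1, 0.2, 0.3, 0.4]  # stops right after the first success
```

Each `send` resumes the generator at the `yield`, so the feedback flows into the loop without a mutable attribute like `iterator.is_adversarial` in the diff above.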
Where are we standing with this? The problem here seems to be that this PR tries to do too much. @cjsg could you split this PR into two parts, one for PyTorch 0.4.0 and the other for the binary search?
@wielandbrendel I've added PyTorch 0.4.0 support separately from this pull request (it required some more changes than the ones here); it's already released. @cjsg you are welcome to do what Wieland proposed, but I have some local re-implementations of parts of the ideas in this pull request and some other things (e.g. batching) affecting these attacks, so I am not sure if it's worth spending larger amounts of time on this right now ;-)
This PR implements 4 things: