Use of hooks #18
Hello, unfortunately I don't have a good answer for you other than: autograd just expects a tuple. If you change the tuple to a tensor (or anything else), you get the following error:

```
Traceback (most recent call last):
  File "guided_backprop.py", line 74, in <module>
    guided_grads = GBP.generate_gradients(prep_img, target_class)
  File "guided_backprop.py", line 59, in generate_gradients
    model_output.backward(gradient=one_hot_output)
  File ".../torch/tensor.py", line 93, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File ".../torch/autograd/__init__.py", line 89, in backward
    ... allow_unreachable=True)  # allow_unreachable flag
TypeError: expected tuple, but hook returned 'Tensor'
```

From there you can trace it through torch/tensor.py and torch/autograd, but in the end I couldn't find a better answer than: it's just how the API is defined. Also, just to clarify, this hook function only affects the backward pass, not the forward pass, so ReLU(x) still returns ReLU(x); only the gradients flowing backward are modified.
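The behavior described above can be verified with a minimal sketch. This is an illustration, not code from the repo: it assumes a recent PyTorch install and uses `register_full_backward_hook` (the modern counterpart of the `register_backward_hook` that produced the traceback in the thread); the hook body is a guided-backprop-style clamp chosen for demonstration.

```python
# Minimal sketch: a backward hook must return a *tuple* of replacement
# gradients, and it only changes the backward pass, not the forward pass.
# Assumes PyTorch is installed.
import torch
import torch.nn as nn

relu = nn.ReLU()

def relu_hook_function(module, grad_in, grad_out):
    # grad_in is a tuple of gradients w.r.t. the module's inputs.
    # Returning a tuple replaces grad_in for the rest of the backward pass;
    # in the older register_backward_hook API, returning a bare Tensor
    # raised: TypeError: expected tuple, but hook returned 'Tensor'
    return (torch.clamp(grad_in[0], min=0.0),)

relu.register_full_backward_hook(relu_hook_function)

x = torch.tensor([-1.0, 2.0], requires_grad=True)
y = relu(x)                         # forward pass is unaffected: [0., 2.]
y.backward(torch.tensor([-3.0, -3.0]))
print(x.grad)                       # negative gradients clamped to zero
```

Without the hook, `x.grad` would be `[0., -3.]`; with it, the negative entry is clamped to `0.`, while `y` itself is unchanged.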
Got it. Thanks. Your answer led me to read into this further. One quick question.
Thanks for the offer, but many of the implementations are quite similar and very simple. This is by design, so that people can grasp the concepts and adapt them to their own needs without digging through hundreds of lines of code.
Thank you for your awesome code. I have a quick question about the use of hooks in some of your code. In src/guided_backprop.py, the hook function returns a tuple. How is this being applied to `self.gradients`? I tried looking up uses of hook layers, but couldn't find an example that resembles yours. More specifically, when `update_relus` returns those tuples, where do they go? I am guessing each one replaces the `grad_in` in `relu_hook_function` in place, and is then passed to `hook_function` in `hook_layers`, but I wanted to make sure. Any guidance would be appreciated.