🚀 Feature
Replace `register_full_backward_hook` with a forward hook and obtain gradient values from the tensors.
Motivation
First of all, thank you for the great package. Unfortunately, I cannot use the package as provided when my model contains in-place nonlinear submodules. So I am suggesting the solution described here: pytorch/pytorch#61519
Pitch
The idea is simple: replace every `register_full_backward_hook` call with `register_forward_hook`, and inside the forward hook call `register_hook` on the output or input tensors as needed to obtain their gradient values. I hope this makes sense and helps you enhance your module.
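A minimal sketch of the suggested pattern (the hook and model below are illustrative, not Captum code): a forward hook registers a tensor-level `register_hook` on the module's output, which fires during backward and receives the gradient, even for an in-place module like `nn.ReLU(inplace=True)` that `register_full_backward_hook` would reject.

```python
import torch
import torch.nn as nn

# Captured gradients, keyed by a name we choose (hypothetical storage).
grads = {}

def forward_hook(module, inputs, output):
    # Register a hook directly on the output tensor; it fires during
    # backward and receives the gradient w.r.t. this tensor.
    def tensor_hook(grad):
        grads["relu_out"] = grad.detach().clone()
    output.register_hook(tensor_hook)

model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(inplace=True))
handle = model[1].register_forward_hook(forward_hook)

x = torch.randn(2, 4, requires_grad=True)
model(x).sum().backward()
handle.remove()

# Gradient of sum() w.r.t. the ReLU output is all ones.
print(grads["relu_out"].shape)  # torch.Size([2, 4])
```

Because the hook lives on the tensor rather than the module, the in-place restriction of module-level full backward hooks does not apply.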
Summary:
This switches usage of full backward hooks to forward hooks that add tensor backward hooks, as suggested in #914. We initially did not choose this approach since it has limitations for backward hooks on modules with multiple input / output tensors (a hook must be registered on each tensor independently), but all current use cases within Captum require only a single tensor input / output.
This change enables in-place modules and removes the limitation on neuron input attribution. DeepLift also no longer needs valid-module checks, as these are not applicable when using tensor hooks.
Pull Request resolved: #979
Reviewed By: NarineK
Differential Revision: D41687791
Pulled By: vivekmig
fbshipit-source-id: 2ddc5aac7b9bf70a56ffb3ace3dc026fca7d4bfa