- support computing the gradient of the modified gradient via second-order gradients
- for this, the second gradient pass must be done without hooks; this is supported either by removing the hooks before computing the second-order gradients, or by setting `hook.active = False` for each hook, e.g. using the contextmanager `composite.inactive()` before computing the second-order gradients (see the sketch after this list)
- make SmoothGrad and IntegratedGradients inherit from Gradient
- add `create_graph` and `retain_graph` arguments for Gradient-Attributors
- add a `.grad` function to Gradient, which is used by its subclasses to compute the gradient
- fix attributor docstrings
- recognize in `BasicHook.backward` whether to use `create_graph=True` for the backward pass that computes the relevance, by checking whether `grad_output` requires a gradient
- add the ReLUBetaSmooth rule, which replaces the gradient of ReLU with the gradient of softplus (i.e. the sigmoid); this is used as a surrogate to compute meaningful gradients of ReLU
- add a test to check the effect of `hook.active`
- add a test to check whether the second-order gradient of Hook is computed as expected
- add second-order gradient tests for gradient attributors
- add a test for `attributor.inactive`
- add a test for `Composite.inactive`
- add a test for ReLUBetaSmooth

TODO:
- add How-To's
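
A rough sketch of the workflow described above: the `ReLUBetaSmooth` rule makes the ReLU gradient smooth (the softplus derivative, i.e. a sigmoid), so the hook-modified gradient itself has a meaningful gradient, and `composite.inactive()` deactivates the hooks for the second backward pass. The composite construction via `LayerMapComposite` and `composite.context`, as well as the module paths, follow the usual zennit layout and are assumptions for illustration, not part of this commit; the `Gradient` attributors additionally accept `create_graph`/`retain_graph` for the same purpose.

```python
import torch
from torch.nn import Linear, ReLU, Sequential

# module paths assumed from the usual zennit layout
from zennit.composites import LayerMapComposite
from zennit.rules import ReLUBetaSmooth

model = Sequential(Linear(8, 16), ReLU(), Linear(16, 4))
input = torch.randn(1, 8, requires_grad=True)

# apply the new ReLUBetaSmooth rule to all ReLU modules, so the modified
# gradient is differentiable instead of piecewise constant
composite = LayerMapComposite(layer_map=[(ReLU, ReLUBetaSmooth())])

with composite.context(model) as modified_model:
    output = modified_model(input)

    # first backward pass: hook-modified gradient; keep the graph so it can
    # itself be differentiated
    relevance, = torch.autograd.grad(output.sum(), input, create_graph=True)

    # second backward pass: must run without the hooks, so deactivate them
    # temporarily with the contextmanager added in this commit
    with composite.inactive():
        second_order, = torch.autograd.grad(relevance.sum(), input)
```

Alternatively, the hooks could be removed entirely (e.g. by leaving the composite context) before the second-order pass; `composite.inactive()` only toggles `hook.active` and keeps the hooks registered.
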
Showing 6 changed files with 381 additions and 67 deletions.