
Add LRP model checks and support unknown layers via AD fallback #26

Merged
merged 18 commits into master from ah/model-checks on Mar 3, 2022

Conversation

@adrhill (Member) commented Mar 1, 2022

Closes #25.

Model checks

Adds a check_model(:LRP, model) function that checks models for compatibility with LRP. It is called by default in the constructor of all LRP analyzers and prints a summary using PrettyTables.jl (shown in the screenshots below).
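The gist of such a check can be sketched in a few lines of standalone Julia. Everything below (is_supported, check_layers) is a hypothetical stand-in to illustrate the idea, not the package's actual implementation:

```julia
# Toy sketch of the idea behind check_model: walk the model's layers and
# collect any that are not registered as supported. All names here are
# hypothetical stand-ins, not the package's actual API.
is_supported(layer) = false                 # unknown layers fail by default
is_supported(::typeof(identity)) = true     # "register" known layers
is_supported(::typeof(sum)) = true

function check_layers(layers; skip_checks=false)
    skip_checks && return true
    unsupported = [l for l in layers if !is_supported(l)]
    isempty(unsupported) && return true
    error("Found unsupported layers: $unsupported")
end

check_layers((identity, sum))                     # passes, returns true
check_layers((identity, tanh); skip_checks=true)  # bypassed, returns true
```

Dispatching the check on each layer's type is what makes the opt-in registration below possible: adding support for a new layer is just one more method definition.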

These checks can be skipped entirely via the LRP analyzer keyword argument skip_checks, for users who know what they are doing. Alternatively, individual custom layers and activation functions can be registered:

LRP_CONFIG.supports_layer(::MyLayer) = true               # for structs
LRP_CONFIG.supports_layer(::typeof(mylayer)) = true       # for functions

LRP_CONFIG.supports_activation(::MyActivation) = true        # for structs
LRP_CONFIG.supports_activation(::typeof(myfunction)) = true  # for functions

Custom layer support

This registration mechanism makes it possible to define LRP rules on custom layers.
To support this, the type restrictions on the default automatic differentiation (AD) fallback have been removed.

By default, the following AD fallback will be applied to registered but unknown layers:

function (rule::AbstractLRPRule)(layer, aₖ, Rₖ₊₁)
    layerᵨ = modify_layer(rule, layer)
    function fwpass(a)
        z = layerᵨ(a)
        s = Zygote.dropgrad(Rₖ₊₁ ./ modify_denominator(rule, z))
        return z ⋅ s
    end
    c = gradient(fwpass, aₖ)[1]
    Rₖ = aₖ .* c
    return Rₖ
end

By default, modify_layer(rule, layer) = layer and modify_denominator(rule, z) = stabilize_denom(z; eps=1.0f-9). Both can be extended by the user for any combination of rule and custom layer.
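For a bias-free linear layer, this fallback can be checked in standalone Julia without Zygote, since the gradient of a ↦ (W*a) ⋅ s at aₖ is simply W' * s when s is treated as a constant (which is what dropgrad ensures). All values below are made up for illustration:

```julia
using Random

# Standalone check of what the AD fallback computes for a bias-free
# linear layer. No AD is needed: the gradient of a -> (W*a) ⋅ s is W' * s.
Random.seed!(42)
W    = randn(3, 4)              # layer weights
aₖ   = rand(4)                  # input activations
Rₖ₊₁ = abs.(randn(3))           # relevance arriving from the layer above

z  = W * aₖ                     # forward pass
s  = Rₖ₊₁ ./ (z .+ 1.0f-9)      # stabilized division, as in modify_denominator
c  = W' * s                     # gradient of fwpass at aₖ
Rₖ = aₖ .* c                    # relevance passed to the layer below

# LRP's conservation property: relevance sums are (approximately) preserved.
isapprox(sum(Rₖ), sum(Rₖ₊₁); rtol=1e-4)
```

The near-equality holds because sum(Rₖ) = (W * aₖ) ⋅ s = Σⱼ zⱼ Rⱼ / (zⱼ + ε), which matches Σⱼ Rⱼ up to the ε stabilization.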

The user can also use multiple dispatch to implement fully custom rules using this interface:

function (rule::AbstractLRPRule)(layer::MyLayer, aₖ, Rₖ₊₁)
    # ...
    return Rₖ
end
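As a self-contained illustration of this dispatch pattern (the rule and layer types below are toy stand-ins, not part of the package), a rule that passes relevance through a shape-preserving layer unchanged could look like:

```julia
# Toy stand-ins for the package's types, just to show the dispatch pattern:
abstract type AbstractLRPRule end
struct PassRule <: AbstractLRPRule end   # hypothetical rule
struct MyLayer end                       # hypothetical shape-preserving layer

# Fully custom rule for MyLayer: pass relevance through unchanged.
function (rule::PassRule)(layer::MyLayer, aₖ, Rₖ₊₁)
    Rₖ = Rₖ₊₁
    return Rₖ
end

PassRule()(MyLayer(), ones(3), [0.2, 0.5, 0.3])   # → [0.2, 0.5, 0.3]
```

Because the rule is a callable struct, adding a method that dispatches on (rule, layer) pairs overrides the AD fallback only for that combination.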

Screenshots

Unknown/unsupported activation function


Unknown layer


@codecov-commenter commented Mar 1, 2022

Codecov Report

Merging #26 (55e361b) into master (614bdb7) will decrease coverage by 4.77%.
The diff coverage is 80.70%.


@@            Coverage Diff             @@
##           master      #26      +/-   ##
==========================================
- Coverage   95.73%   90.95%   -4.78%     
==========================================
  Files           8        9       +1     
  Lines         164      199      +35     
==========================================
+ Hits          157      181      +24     
- Misses          7       18      +11     
Impacted Files      Coverage            Δ
src/lrp_checks.jl   69.44% <69.44%>     (ø)
src/flux.jl         96.29% <100.00%>    (ø)
src/lrp.jl          93.10% <100.00%>    (-0.23%) ⬇️
src/lrp_rules.jl    100.00% <100.00%>   (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@adrhill (Member, Author) commented Mar 2, 2022

Only needs some updated docstrings and documentation!

@adrhill adrhill changed the title Add check_model Add LRP model checks and support unknown layers via AD fallback Mar 3, 2022
@adrhill adrhill marked this pull request as ready for review March 3, 2022 18:16
@adrhill adrhill merged commit 11ea2b7 into master Mar 3, 2022
@adrhill adrhill deleted the ah/model-checks branch March 3, 2022 18:20
Successfully merging this pull request may close these issues: Add model checks for LRP.