Composite: fix passing of beta_smooth value
The BetaSmooth composite did not pass the beta_smooth value to the ReLUBetaSmooth hook constructor, resulting in beta_smooth == 10 at all times.
annahdo authored and chr5tphr committed Nov 17, 2022
1 parent d46f3e7 commit 5f14c27
Showing 1 changed file with 1 addition and 1 deletion.
src/zennit/composites.py: 2 changes (1 addition, 1 deletion)

@@ -494,6 +494,6 @@ def __init__(self, beta_smooth=10., layer_map=None, zero_params=None, canonizers
            layer_map = []

        layer_map = layer_map + [
-           (torch.nn.ReLU, ReLUBetaSmooth()),
+           (torch.nn.ReLU, ReLUBetaSmooth(beta_smooth=beta_smooth)),
        ]
        super().__init__(layer_map=layer_map, canonizers=canonizers)
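The bug pattern fixed here can be sketched in plain Python, independent of zennit: a wrapper class accepts a parameter but never forwards it to the object it constructs, so the inner default silently wins. The class names below are hypothetical stand-ins, not zennit's actual classes.

```python
class HookSketch:
    """Stand-in for a hook like ReLUBetaSmooth, with its own default."""
    def __init__(self, beta_smooth=10.):
        self.beta_smooth = beta_smooth


class CompositeBuggy:
    """Before the fix: beta_smooth is accepted but never forwarded."""
    def __init__(self, beta_smooth=10.):
        self.hook = HookSketch()  # bug: argument dropped, default always used


class CompositeFixed:
    """After the fix: beta_smooth is forwarded to the hook constructor."""
    def __init__(self, beta_smooth=10.):
        self.hook = HookSketch(beta_smooth=beta_smooth)


print(CompositeBuggy(beta_smooth=2.5).hook.beta_smooth)  # 10.0, not 2.5
print(CompositeFixed(beta_smooth=2.5).hook.beta_smooth)  # 2.5
```

Because the inner and outer defaults coincide (both 10.), the buggy version behaves correctly for the default case, which is why the bug is easy to miss until a non-default beta_smooth is requested.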
