question in lrp_layers.py #3

Open
S200331082 opened this issue Dec 15, 2022 · 2 comments

Comments

@S200331082

Hi @kaifishr,
Thanks for your implementation. I'm trying to reimplement LRP on ResNet50, but it has a BatchNorm2D layer in the backbone. I'm new to Python and don't know how to write RelevancePropagationBatchNorm2D in lrp_layers.py. Could you give me some ideas? Thanks a lot.

@kaifishr
Owner

Since BatchNorm2D consists of two consecutive affine linear transformations, I would try to weight the relevance scores by the weight parameters of the batch normalization layer learned during training.
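
For illustration, a minimal, untested sketch of that suggestion (not code from the repository): it treats the batch norm layer in eval mode as a per-channel affine map and redistributes relevance with an epsilon-stabilized ratio. The `forward(a, r)` interface (layer input and incoming relevance in, redistributed relevance out) and the stabilization constant are assumptions here, not necessarily how the other rules in lrp_layers.py are written.

```python
import torch
from torch import nn


class RelevancePropagationBatchNorm2d(nn.Module):
    """Sketch: relevance propagation through a frozen BatchNorm2d layer.

    In eval mode, batch norm acts per channel as an affine map
    z = w_eff * a + b_eff, with w_eff = gamma / sqrt(running_var + eps)
    and b_eff = beta - w_eff * running_mean. Relevance is redistributed
    elementwise in proportion to the weighted input contribution a * w_eff / z.
    """

    def __init__(self, layer: nn.BatchNorm2d, eps: float = 1e-6) -> None:
        super().__init__()
        self.layer = layer
        self.eps = eps  # stabilizer; the value 1e-6 is an assumption

    @torch.no_grad()
    def forward(self, a: torch.Tensor, r: torch.Tensor) -> torch.Tensor:
        bn = self.layer
        # Effective per-channel scale and shift (assumes affine=True and eval mode).
        w_eff = (bn.weight / torch.sqrt(bn.running_var + bn.eps)).view(1, -1, 1, 1)
        b_eff = bn.bias.view(1, -1, 1, 1) - w_eff * bn.running_mean.view(1, -1, 1, 1)

        z = w_eff * a + b_eff                                    # batch norm output
        z = z + self.eps * ((z >= 0).to(z.dtype) * 2.0 - 1.0)    # avoid division by zero
        return (a * w_eff / z) * r                               # relevance passed to the input
```

As with other LRP rules that keep the bias in the denominator, part of the relevance is absorbed by `b_eff`, so the relevance sum is only approximately conserved; dropping `b_eff` from `z` would make this elementwise rule pass relevance through exactly unchanged.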

@zbb2022hust

Hello, I also run into problems when propagating relevance through a BatchNorm1D layer. I'm not strong in math, but I urgently need this method to evaluate my FCN model in a data-driven fault diagnosis task. Could you add RelevancePropagationBatchNorm1d/2d to lrp_layers.py, or explain more clearly how to calculate this?
Thanks!
Best regards.
