Question: Position of BN in BAM. #49

Open
Ujjawal-K-Panchal opened this issue Jan 13, 2023 · 2 comments

Comments

@Ujjawal-K-Panchal
I think there may be an inconsistency between the paper and the code in where Batch Normalization is applied.

1st instance of possible inconsistency:

The paper mentions on Page 4, Section 3, subsection Channel attention branch that:

[Screenshot of the paper's channel attention formula, $M_c(\mathbf{F}) = \mathrm{BN}(\mathrm{MLP}(\mathrm{AvgPool}(\mathbf{F})))$]

So it is understood that the batch norm is applied at the end of the MLP, i.e. after the final layer. However, the implementation's forward() function returns:

return self.gate_c( avg_pool ).unsqueeze(2).unsqueeze(3).expand_as(in_tensor)

Where self.gate_c's final layer is defined as:

self.gate_c.add_module( 'gate_c_fc_final', nn.Linear(gate_channels[-2], gate_channels[-1]) )
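For comparison, here is a minimal sketch (not the repository's code; the channel sizes and module names below are illustrative) of the channel-attention MLP with a BN layer placed after the final Linear, which is how I read $M_c(\mathbf{F}) = \mathrm{BN}(\mathrm{MLP}(\mathrm{AvgPool}(\mathbf{F})))$:

```python
import torch
import torch.nn as nn

# Hypothetical sizes: C = 64, reduction ratio = 16, so the bottleneck is 4 channels.
gate_channels = [64, 4, 64]

gate_c = nn.Sequential()
gate_c.add_module('gate_c_fc_0', nn.Linear(gate_channels[0], gate_channels[1]))
gate_c.add_module('gate_c_bn_1', nn.BatchNorm1d(gate_channels[1]))
gate_c.add_module('gate_c_relu_1', nn.ReLU())
gate_c.add_module('gate_c_fc_final', nn.Linear(gate_channels[-2], gate_channels[-1]))
# The BN the paper places at the end of the MLP; this module is absent in the repository.
gate_c.add_module('gate_c_bn_final', nn.BatchNorm1d(gate_channels[-1]))

avg_pool = torch.randn(8, gate_channels[0])  # (batch, C) after global average pooling
out = gate_c(avg_pool)                       # (batch, C); BN applied after the final Linear
```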

@Ujjawal-K-Panchal (Author)
2nd instance of possible inconsistency:

The paper mentions on Page 5, Section 3, subsection Spatial attention branch that:

[Screenshot of the paper's spatial attention formula, $M_s(\mathbf{F}) = \mathrm{BN}(f_3^{1 \times 1}(f_2^{3 \times 3}(f_1^{3 \times 3}(f_0^{1 \times 1}(\mathbf{F})))))$]

So it is understood that the batch norm is applied at the end of the spatial attention branch, i.e. after the final convolution layer. However, the implementation's forward() function returns:

return self.gate_s( in_tensor ).expand_as(in_tensor)

Where self.gate_s's final layer is defined as:

self.gate_s.add_module( 'gate_s_conv_final', nn.Conv2d(gate_channel//reduction_ratio, 1, kernel_size=1) )
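Again for comparison, a minimal sketch (illustrative sizes and names, not the repository's code) of the spatial branch with a BN layer placed after the final 1×1 convolution, matching my reading of the paper:

```python
import torch
import torch.nn as nn

# Hypothetical sizes: C = 64, reduction ratio = 16, dilation = 4.
gate_channel, reduction_ratio, dilation = 64, 16, 4

gate_s = nn.Sequential()
gate_s.add_module('gate_s_conv_reduce',
                  nn.Conv2d(gate_channel, gate_channel // reduction_ratio, kernel_size=1))
gate_s.add_module('gate_s_conv_di_0',
                  nn.Conv2d(gate_channel // reduction_ratio, gate_channel // reduction_ratio,
                            kernel_size=3, padding=dilation, dilation=dilation))
gate_s.add_module('gate_s_conv_final',
                  nn.Conv2d(gate_channel // reduction_ratio, 1, kernel_size=1))
# The BN the paper places after the final 1x1 convolution; absent in the repository.
gate_s.add_module('gate_s_bn_final', nn.BatchNorm2d(1))

in_tensor = torch.randn(8, gate_channel, 32, 32)
out = gate_s(in_tensor)  # (8, 1, 32, 32); the branch would then expand_as(in_tensor)
```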

@Ujjawal-K-Panchal (Author)

One might suspect that the final BN layers are instead added in the BAM() class.
However, looking at BAM()'s forward pass:

att = 1 + F.sigmoid( self.channel_att(in_tensor) * self.spatial_att(in_tensor) )

it becomes clear that no BN layer is applied to either branch before $M_c$ and $M_s$ are combined.
Is this an inconsistency?
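A quick way to check this empirically is to print the module tree; the import path below is an assumption about the repository layout and may need adjusting:

```python
# Assumes the repository's bam.py is importable; adjust the path to the repo layout.
from bam import BAM

bam = BAM(64)
print(bam)
# The printed tree ends the two branches with 'gate_c_fc_final' and 'gate_s_conv_final';
# no BatchNorm module appears after either of them, and BAM.forward adds none.
```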

Ujjawal-K-Panchal changed the title from "Inconsistency: Position of BN in BAM." to "Question: Position of BN in BAM." on Jan 13, 2023