I do not understand the principle of the RM operation on the SEBlock. Why do you concat the original input feature map with the input feature plus the SE output? Is it just to get a plain network structure when deploying? Please help clear up my confusion, thanks!
The RM operation removes the residual connection across a non-linear layer. Besides ReLU, which is covered in the original paper, we also try to remove the residual connection across the SEBlock. For this purpose, additional channels are needed to reserve the input feature map, which is why we concat the original input feature map with the input feature plus the SE output.
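To make the channel-reservation idea concrete, here is a toy NumPy sketch (not the repo's actual implementation — the `se_block` here is a simplified single-gate stand-in for the real two-FC-layer SE module, and `w` is a hypothetical per-channel weight). It shows that concatenating the input with the SE branch and letting a later layer sum the two channel groups gives exactly the same result as the residual form `x + SE(x)`, with no shortcut spanning the block:

```python
import numpy as np

def se_block(x, w):
    """Simplified SE: squeeze (global average pool), excite (sigmoid
    gate), rescale. A real SEBlock uses two FC layers; `w` is a
    hypothetical per-channel weight standing in for them."""
    s = x.mean(axis=(1, 2), keepdims=True)       # squeeze -> (C, 1, 1)
    gate = 1.0 / (1.0 + np.exp(-(w * s)))        # excite -> sigmoid gate
    return gate * x                              # rescale the input

def residual_se(x, w):
    """Residual form: a shortcut crosses the non-linear SEBlock."""
    return x + se_block(x, w)

def rm_se(x, w):
    """Plain (RM) form: double the channels so the reserved copies
    carry x past the SEBlock; the following layer merges the two
    channel groups (here, a sum that a 1x1 conv with fixed identity
    weights would implement). No shortcut crosses the block."""
    doubled = np.concatenate([x, se_block(x, w)], axis=0)  # 2C channels
    c = x.shape[0]
    return doubled[:c] + doubled[c:]             # merge -> x + SE(x)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8, 8))               # (C, H, W) feature map
w = rng.standard_normal((4, 1, 1))               # toy per-channel weight
assert np.allclose(residual_se(x, w), rm_se(x, w))
```

The two forms are numerically identical, but the RM version is a straight feed-forward chain at deploy time: the "residual add" has been folded into the weights of the layer after the concat, at the cost of temporarily doubling the channel count.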