
Confusion about the Attention Map Generation of distribute branch #5

Closed
KamInNg opened this issue Oct 5, 2018 · 7 comments

@KamInNg

KamInNg commented Oct 5, 2018

Hi,@hszhao

It seems that the collect branch and the distribute branch are the same (as shown in Fig. 3).
Could you show the equation for the distribute branch (like Eq. 9 for the collect branch in the paper)?

@qiulesun

qiulesun commented Nov 3, 2018

@KamInNg, @hszhao I have the same confusion. Could you let me know if you understand the differences between the two branches?

@EthanZhangYi
Collaborator

Hi @KamInNg @qiulesun,
The difference between the two branches is in the attention map generation step (Section 3.3 of the ECCV paper, Equation 9).
[screenshot: Equation 9 from the paper]

You can also refer to the released code:
https://github.com/hszhao/PSANet/blob/master/src/caffe/layers/pointwise_spatial_attention_layer.cu#L66
https://github.com/hszhao/PSANet/blob/master/src/caffe/layers/pointwise_spatial_attention_layer.cu#L201
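As a rough illustration of the difference (this is a NumPy sketch under my own assumptions, not the released Caffe/CUDA implementation — function and variable names here are invented), both branches read the same kind of over-complete map with (2H-1)*(2W-1) channels, and for each position i crop an H×W window aligned with i. The branches differ only in how that window fills the pairwise attention matrix: in the collect branch it becomes row i (how i gathers from every j), while in the distribute branch it becomes column i (how i spreads to every j).

```python
import numpy as np

def psa_attention_maps(h_over, H, W, mode="collect"):
    """Sketch of turning an over-complete map into pairwise attention.

    h_over: array of shape ((2H-1)*(2W-1), H, W) -- one channel per
            possible relative position, at each spatial location.
    Returns an (H*W, H*W) pairwise attention matrix.
    """
    attn = np.zeros((H * W, H * W))
    for y in range(H):
        for x in range(W):
            i = y * W + x
            # Reshape the channel vector at (y, x) into a map over
            # relative positions, indexed by (dy + H - 1, dx + W - 1).
            rel = h_over[:, y, x].reshape(2 * H - 1, 2 * W - 1)
            # Crop the H x W window of relative positions that are
            # actually reachable from (y, x) inside the image.
            win = rel[H - 1 - y : 2 * H - 1 - y,
                      W - 1 - x : 2 * W - 1 - x]
            if mode == "collect":
                attn[i, :] = win.reshape(-1)   # row i: i collects from each j
            else:
                attn[:, i] = win.reshape(-1)   # column i: i distributes to each j
    return attn
```

With this pairing, running the two modes on the same over-complete map yields matrices that are transposes of each other, which is exactly the row-vs-column distinction between the two branches.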

@wangq95

wangq95 commented Nov 19, 2018

Hi @EthanZhangYi, thanks for your advice, but I am still confused about how the over-complete map H, whose number of channels is (2H-1)*(2W-1), encodes the relative position information of i and j. Can you help me? Best regards.


@wangq95

wangq95 commented Nov 19, 2018

Hi @EthanZhangYi, I almost understand this mechanism, but it still bothers me why the number of relative positions Δji equals exactly (2H-1)*(2W-1); this is not very clear in the ECCV paper. Could you provide a clearer explanation? I would be very grateful for your help. Best regards!

@EthanZhangYi
Collaborator

In order to cover the whole image, we take all possible relative positions in the feature map into account. Given each pair of points in the map, we can calculate their relative position. Then, over all pairs of points, the relative height ranges from 1-H to H-1 (2H-1 possible values), and the relative width ranges from 1-W to W-1 (2W-1 possible values). The total number of possible relative positions is therefore (2H-1)*(2W-1).

@wangq95

wangq95 commented Nov 19, 2018

@EthanZhangYi Get it! Thank you very much!
