There is only one SAMS layer when ap_level = 2? #1

Closed
hiyijian opened this issue Sep 2, 2021 · 5 comments

hiyijian commented Sep 2, 2021

There is only one SAMS layer when ap_level = 2 according to baseline.py. Is that consistent with the paper?

CHENGY12 commented Sep 2, 2021

Thanks for your attention to our work. "AP_level = 2" denotes that the attention pyramid has both level-1 attention and level-2 attention. Note that the level-1 attention is the baseline global attention model (e.g., SE attention in the channel-wise case); only the level-2 attention needs a SAMS module to split and merge. This is consistent with our paper (e.g., we write "APNet-C1 only has a global attention map" on page 8). Please see the "Ablation Studies" part for more details.
Please do not hesitate to contact me if you have any additional questions.
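A minimal sketch of what this structure could look like, assuming the level-2 SAMS splits the feature map into horizontal parts and attends to each with its own SE gate; the module names, the split axis, and the number of parts are illustrative assumptions, not the authors' released code:

```python
import torch
import torch.nn as nn

class GlobalSE(nn.Module):
    """Level-1: plain squeeze-and-excitation over the whole feature map."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)

class SAMS(nn.Module):
    """Level-2: split into stripes, attend to each stripe, merge back."""
    def __init__(self, channels, num_parts=2, reduction=16):
        super().__init__()
        self.num_parts = num_parts
        self.attn = nn.ModuleList([GlobalSE(channels, reduction) for _ in range(num_parts)])

    def forward(self, x):
        parts = torch.chunk(x, self.num_parts, dim=2)          # split along height
        parts = [att(p) for att, p in zip(self.attn, parts)]   # per-part attention
        return torch.cat(parts, dim=2)                         # merge

class AttentionPyramid(nn.Module):
    """ap_level = 2 -> level-1 global attention plus a single SAMS stage."""
    def __init__(self, channels, ap_level=2):
        super().__init__()
        self.level1 = GlobalSE(channels)
        self.level2 = SAMS(channels) if ap_level >= 2 else nn.Identity()

    def forward(self, x):
        return self.level2(self.level1(x))
```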

hiyijian commented Sep 2, 2021

Thanks for your reply. I noticed that APNet-C1 slightly outperforms SE-ResNet according to Table 2 in your paper. That was a little unexpected for me, since APNet-C1 has far fewer SELayers, placed at a different location (after f(x) + x). What do you think about this, please?
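For context, a rough sketch of the placement difference mentioned above: a standard SE-ResNet block scales the residual branch f(x) before the identity is added, whereas the variant described here applies the attention after the sum f(x) + x. The class names below are illustrative assumptions, not the repository's code:

```python
import torch.nn as nn

class SEResNetStyleBlock(nn.Module):
    """SE applied to the residual branch: out = relu(se(f(x)) + x)."""
    def __init__(self, f, se):
        super().__init__()
        self.f, self.se = f, se
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.se(self.f(x)) + x)

class PostSumAttentionBlock(nn.Module):
    """SE applied after the residual addition: out = se(relu(f(x) + x))."""
    def __init__(self, f, se):
        super().__init__()
        self.f, self.se = f, se
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.se(self.relu(self.f(x) + x))
```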

CHENGY12 commented Sep 2, 2021

Personally, I think APNet-C1 and SE-ResNet are comparable. It may be because the number of attention layers in APNet-C1 is already sufficient for person ReID (Market and Duke). In our experiments, we did not find a consistent relationship between the number of attention layers and ReID performance.

hiyijian closed this as completed Sep 3, 2021
hiyijian reopened this Sep 3, 2021

hiyijian commented Sep 3, 2021

It seems that SELayer takes no effect. Please see:

```python
zout = self.se(out)
```
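One way this line could leave SELayer without effect is if the attended result is bound to a name that is never used afterwards, so the unmodified `out` keeps flowing through the block. A toy stand-in (not the repository's code) illustrating that pattern:

```python
import torch
import torch.nn as nn

class SELayer(nn.Module):
    """Toy squeeze-and-excitation layer returning the re-weighted features."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)

se = SELayer(8)
out = torch.randn(1, 8, 4, 4)

zout = se(out)  # buggy pattern: the attended features are never used, so SE has no effect
out = se(out)   # fixed pattern: the attended features replace `out` and flow onward
```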

CHENGY12 commented Sep 3, 2021

Thank you very much for pointing this out! We have fixed it. It is a typo that was introduced when we cleaned up the code for the GitHub release: we only checked APNet and overlooked this baseline. The typo has no influence on the conclusions in the paper because it exists only in this repo, not in our experiments. Thank you again for helping us find it. Please let me know if you have any other questions.

hiyijian closed this as completed Sep 3, 2021