There is only one SAMS layer when ap_level = 2? #1
Comments
Thanks for your attention to our work. "ap_level = 2" denotes that the attention pyramid has both level-1 attention and level-2 attention. Note that the level-1 attention is the baseline global attention model (e.g., SE attention in the channel-wise model); only the level-2 attention needs a SAMS module to split and merge. This is consistent with our paper (e.g., we write "APNet-C1 only has a global attention map" on page 8). Please see the "Ablation Studies" section for more details.
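To make the layout concrete, here is a minimal numpy sketch of that two-level pyramid, under my own reading of the thread: level 1 is a plain global SE-style channel attention, and only level 2 splits the feature map into parts before attending and merging. All names (`se_attention`, `sams_attention`, `attention_pyramid`) and the part-splitting scheme are hypothetical illustrations, not the authors' actual implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_attention(feat):
    """SE-style global channel attention: squeeze by global average
    pooling, then reweight each channel. The FC excitation layers of
    a real SE block are omitted for brevity."""
    # feat: (C, H, W)
    squeeze = feat.mean(axis=(1, 2))            # (C,)
    weights = sigmoid(squeeze)                  # (C,), in (0, 1)
    return feat * weights[:, None, None]

def sams_attention(feat, num_parts=2):
    """Hypothetical SAMS-style split-attend-merge step: split the
    feature map along height, attend to each part independently,
    then concatenate the parts back together."""
    parts = np.array_split(feat, num_parts, axis=1)
    attended = [se_attention(p) for p in parts]
    return np.concatenate(attended, axis=1)

def attention_pyramid(feat, ap_level=2):
    """ap_level = 2 -> one global (level-1) attention plus a single
    SAMS (level-2) split-and-merge layer, matching the reply above."""
    out = se_attention(feat)                    # level 1: global only
    for level in range(2, ap_level + 1):
        out = sams_attention(out, num_parts=2 ** (level - 1))
    return out

feat = np.random.rand(8, 4, 4)
out = attention_pyramid(feat, ap_level=2)
assert out.shape == feat.shape
```

With `ap_level = 2` the loop body runs exactly once, which is why only one SAMS layer appears in the code the question refers to.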
Thanks for your reply. I noticed that APNet-C1 slightly outperforms SE-ResNet according to Table 2 in your paper. That was a little unexpected to me, since APNet-C1 has far fewer SE layers, placed at a different location (after f(x) + x). What do you think about this?
Personally, I think APNet-C1 and SE-ResNet are comparable. It may be that the number of attention layers in APNet-C1 is already sufficient for person ReID (Market and Duke). In our experiments, we did not find a consistent relationship between the number of attention layers and ReID performance.
It seems that the SELayer has no effect. Please see APNet/modeling/backbones/se_resnet.py Line 76 in 74efb12
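I don't have the exact diff in front of me, but the typo pattern described here usually means the attention is computed and its result discarded, so the SE layer silently does nothing. A minimal self-contained sketch of the bug and its fix (toy classes, not the actual repo code):

```python
class SELayer:
    """Toy stand-in for an SE attention layer: just scales its input."""
    def __init__(self, scale=0.5):
        self.scale = scale

    def __call__(self, x):
        return [v * self.scale for v in x]

class Bottleneck:
    """Sketch of a residual block containing an SE layer."""
    def __init__(self):
        self.se = SELayer()

    def forward_buggy(self, x):
        out = list(x)
        self.se(out)          # BUG: return value thrown away
        return out

    def forward_fixed(self, x):
        out = list(x)
        out = self.se(out)    # FIX: keep the attended output
        return out

block = Bottleneck()
x = [1.0, 2.0]
assert block.forward_buggy(x) == [1.0, 2.0]   # SE had no effect
assert block.forward_fixed(x) == [0.5, 1.0]   # SE applied
```

Since the call is still executed in the buggy version, the model runs without error, which is why this kind of typo is easy to miss when cleaning code for release.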
Thank you very much for pointing this out! We have fixed it. It is a typo that was introduced when we cleaned up the code for release; we checked APNet but neglected to check this file. The typo has no influence on the conclusions in the paper because it exists only in this repo, not in our experiments. Thank you again for helping us find it. Please let me know if you have any other questions.
There is only one SAMS layer when ap_level = 2 according to baseline.py. Is that consistent with the paper?