
“Modulated Attention” no use? #50

Closed
GuohongLi opened this issue Nov 18, 2019 · 1 comment

@GuohongLi
In ImageNet_LT, I tested stage 1 with 'use_selfatt': True and with False. The training results are:
#True
Evaluation_accuracy_micro_top1: 0.219
Averaged F-measure: 0.175
Many_shot_accuracy_top1: 0.422 Median_shot_accuracy_top1: 0.114 Low_shot_accuracy_top1: 0.006
Training Complete.
Best validation accuracy is 0.219 at epoch 30
Done
ALL COMPLETED.

#False
Evaluation_accuracy_micro_top1: 0.220
Averaged F-measure: 0.175
Many_shot_accuracy_top1: 0.427 Median_shot_accuracy_top1: 0.113 Low_shot_accuracy_top1: 0.007
Training Complete.
Best validation accuracy is 0.220 at epoch 30
Done
ALL COMPLETED.

@zhmiao (Owner) commented Dec 19, 2019

@GuohongLi Thank you very much for asking, and sorry for the late reply. We have finally debugged the published code, and the current open-set performance is:

============
Phase: test

Evaluation_accuracy_micro_top1: 0.361
Averaged F-measure: 0.501
Many_shot_accuracy_top1: 0.442 Median_shot_accuracy_top1: 0.352 Low_shot_accuracy_top1: 0.175

==========

This is higher than what we reported in the paper. We updated some of the modules with the clone() method and set use_fc to False in the first stage. These changes should lead to the proper results. Please give it a try. Thank you very much again.
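For context, here is a minimal sketch of the two config toggles discussed in this thread. The key names ('use_selfatt', 'use_fc') come from the issue itself; the surrounding dict structure and the helper function are hypothetical illustrations, not copied from the repo's actual config files.

```python
# Hypothetical stage-1 config sketch; only the two key names are from the
# thread ('use_selfatt' from the question, 'use_fc' from the fix).
stage1_config = {
    'use_selfatt': True,   # enable the modulated-attention module
    'use_fc': False,       # per the fix: no extra FC head in stage 1
}

def stage1_uses_attention(cfg):
    """Return True when the modulated-attention branch is enabled."""
    return bool(cfg.get('use_selfatt', False))
```

With 'use_selfatt' toggled off, the model falls back to the plain backbone, which is how the near-identical numbers above were produced.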

For Places, the current config won't work either. The reason we could not reproduce the reported results is that we forgot that in the first stage we actually did not freeze the weights; we only freeze them in the second stage. We will update the corresponding code as soon as possible.
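The freezing pattern described above can be sketched as follows. This is a generic PyTorch illustration under stated assumptions, not the repo's actual training code: the two nn.Linear layers stand in for the backbone and the stage-2 head, and the point is simply that "freezing" means requires_grad=False so the optimizer never updates those weights.

```python
# Hedged sketch: stage 1 trains everything end-to-end (nothing frozen);
# stage 2 freezes the backbone and optimizes only the new head.
import torch
import torch.nn as nn

backbone = nn.Linear(8, 4)    # stand-in for the feature extractor
classifier = nn.Linear(4, 2)  # stand-in for the stage-2 head

# Stage 2: freeze the backbone only.
for p in backbone.parameters():
    p.requires_grad = False

# Hand the optimizer only the parameters that should still update.
trainable = [p for p in classifier.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable, lr=0.1)
```

Skipping the freeze loop in stage 1 (i.e. leaving every parameter trainable) is what the fix described above amounts to.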
