Why is unsup_loss 0.000? #117
Comments
It is 0 at the beginning because no pseudo boxes have been generated yet. However, it should become greater than 0 after several iterations.
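To illustrate the point above, here is a minimal toy sketch (not the repository's actual code) of why the unsupervised loss starts at zero: pseudo boxes are kept only when the teacher's score exceeds a threshold, and an empty pseudo-label set contributes no loss. The function names and the per-box loss are invented for illustration.

```python
def filter_pseudo_boxes(scores, boxes, thr=0.9):
    """Keep only boxes whose teacher score exceeds the threshold."""
    return [b for s, b in zip(scores, boxes) if s > thr]

def unsup_loss(pseudo_boxes):
    """Toy stand-in for the unsupervised loss: no pseudo boxes -> zero."""
    if not pseudo_boxes:
        return 0.0
    return sum(1.0 for _ in pseudo_boxes)  # placeholder per-box loss term

# Early in training the teacher is weak: every score is below the threshold,
# so no pseudo boxes survive and the loss is exactly 0.
early = filter_pseudo_boxes([0.3, 0.5], [(0, 0, 10, 10), (5, 5, 20, 20)])
print(unsup_loss(early))

# After enough iterations some teacher scores pass the threshold and the
# unsupervised loss becomes non-zero.
later = filter_pseudo_boxes([0.95, 0.5], [(0, 0, 10, 10), (5, 5, 20, 20)])
print(unsup_loss(later))
```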
Also, would loading a pretrained model speed up convergence? The loss still seems to be 0 after many epochs.
If you are using MS COCO, it should increase quickly and become greater than 0 after about 1000~2000 iterations. For other datasets, I am not sure.
Yes. adding
Thanks, I'll give it a try.
I noticed the same problem while training on a custom dataset, even after a large number of iterations. Have you figured out what the issue was?
Maybe you can try training a supervised model to see the supervised performance and get an impression of what the predictions look like.
Same here, I get 0 unsup_loss bbox even though I used
I am not sure what the problem is. Could you try to train a supervised model and test it on your images to see whether the dataset and model are ready? |
Yes, I trained a model using a config in config/baseline/ and it seems to do well:
When I try with Soft Teacher, it is all zeros (i.e. no boxes).
Is it possible for you to upload the log and the config file here? |
Did you solve this problem? I've been stuck on the same problem for days.
Did you solve this problem?
I also encountered a similar problem, and I found that the teacher model's bbox scores quickly drop below the pseudo-label threshold within a few iterations, which is strange.
Maybe the threshold is set too high. The loss is zero because there are no pseudo labels. The appropriate pseudo-label threshold differs across datasets and models.
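If you want to experiment with lowering the thresholds, a hypothetical mmdetection-style config excerpt might look like the sketch below. The exact field names and default values vary between repository versions, so treat these as assumptions and check your own config file:

```python
# Hypothetical train_cfg excerpt (field names may differ in your version).
# Lowering these thresholds lets more teacher predictions through as pseudo
# labels, at the cost of noisier supervision on a custom dataset.
model = dict(
    train_cfg=dict(
        pseudo_label_initial_score_thr=0.5,  # initial filtering of teacher boxes
        rpn_pseudo_threshold=0.9,            # e.g. try 0.7 for a weaker teacher
        cls_pseudo_threshold=0.9,            # e.g. try 0.7 on a custom dataset
    )
)
```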