Implementation issue #5
Comments
These masks are almost identical to the bounding boxes. I wonder what other magic tricks the authors have used to make the instance masks as tight as reported in the paper?
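One quick way to substantiate this observation is to measure how much of its own tight bounding box each predicted mask fills: a ratio near 1.0 means the mask has degenerated into a rectangle. The function below is not part of the released code, just a minimal NumPy sketch (the name `mask_fills_its_box` and the 0.9 threshold are my own choices).

```python
import numpy as np

def mask_fills_its_box(mask, thresh=0.9):
    """Return True if a binary mask is nearly a solid rectangle,
    i.e. it covers at least `thresh` of its own tight bounding box."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return False  # empty mask has no box
    box_area = (ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1)
    return bool(mask.sum() / box_area >= thresh)
```

Running this over the predicted masks and reporting the fraction that triggers it would make the "masks are just boxes" claim concrete.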
Hi vnbot2, the result seems weird. I have checked the results of my previous experiments and I am absolutely sure that this is not how my results look. I will rerun the whole experiment with the released code later to check whether there is a problem. Moreover, I noticed that the labels in the figures are all wrong (e.g., the cat is classified as a truck). Can you provide more details about your training dataset, annotation format, training iterations, and training log? I will let you know if I find anything. Cheng-Chun
Hi, @chengchunhsu @vnbot2, we ran the released code on the VOC 2012 dataset (not the augmented version), and the results seem reasonable. However, we have not found the code that computes the mask AP. Could you please update your evaluation code? And if possible, could you please provide your VOCSBD json annotation file? We find the json download link is no longer available. Thank you!
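Until the official evaluation code is released, the core of a mask-AP-style check can be sketched in a few lines: compute mask IoU and greedily match score-sorted predictions to ground truth at an IoU threshold. This is a simplified stand-in, not the repo's evaluation (a real mask AP would use pycocotools' COCOeval and average over thresholds); the helper names here are my own.

```python
import numpy as np

def mask_iou(a, b):
    """IoU between two binary masks of the same shape."""
    a, b = a.astype(bool), b.astype(bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 0.0

def match_at_iou(preds, gts, iou_thresh=0.5):
    """Greedy one-to-one matching of predicted masks (assumed sorted by
    descending score) to ground-truth masks.
    Returns (true positives, false positives, false negatives)."""
    unmatched = list(range(len(gts)))
    tp = 0
    for p in preds:
        best, best_iou = -1, iou_thresh
        for gi in unmatched:
            iou = mask_iou(p, gts[gi])
            if iou >= best_iou:
                best, best_iou = gi, iou
        if best >= 0:
            unmatched.remove(best)
            tp += 1
    return tp, len(preds) - tp, len(unmatched)
```

From the TP/FP/FN counts per image and class, precision/recall and an AP@0.5 curve follow directly.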
Same request. Besides, I ran the code with sh train_voc_aug.sh and it converges to a loss of ~0.13, but the test mAP is about 0. Are there any insights?
I retrained with this config [e2e_mask_rcnn_R_101_FPN_4x_voc_coco_aug_cocostyle], and it seems the model is not trained correctly with the two new mask losses. After training, the predicted masks are just rectangles. I'm wondering whether the implementation is the same as the one in your paper?