Fix for incorrect mean AP #1330
Conversation
can we pls add a test to verify the correctness and not break it later again...
Sorry for the late response @Borda. Yes, will do.
@Borda I have added the test. Please check. A doctest failed. Checking.
(force-pushed from bae54ee to 1621983; pre-commit.ci auto fixes, see https://pre-commit.ci for more information)
Could you please speed up approving this PR? :)
I see it as a draft, so are you saying it is ready for review?
Hi @dhananjaisharma10, we found this issue in our system and I was going to look at it. Fortunately, I checked the open issues @Borda sent me and saw that you have a PR for it, which fixes it and seems to be green now. Could you please remove the draft status so we can all benefit from the fix? If you need more time / help to adjust anything, just let me know. Cheers!
@wilderrodrigues what do you mean by removing the draft and still benefiting from it? Do you mean merging it? Regarding "If you need more time/help to adjust anything, just let me know": I am not sure if the fix is ready, @SkafteNicki @twsl
@homomorfism we cannot approve this PR as long as tests are failing, as this indicates wrong behaviour. @wilderrodrigues @dhananjaisharma10 Are any of you familiar with detection metrics, and do you have the bandwidth and want to take this to merging? Unfortunately I have very little knowledge of detection...
@homomorfism @wilderrodrigues, could you please help with debugging the last three failing tests?
Hi @Borda, could you please help me with running the tests locally? I am trying to find why they are failing. They do not seem directly related to my fix, but I could be wrong.
the error states that the expected values are different from what the metric returns
that is fine :)
Looking into it now. There is also some discrepancy in the mAP results. It's a bit larger than with my tests. What is weird is that locally (MacBook) and on an RTX 3090 with Ubuntu 20.04, those tests are passing.
@dhananjaisharma10, we value your work and thank you for your contribution 💜 Could you please revisit this PR with respect to the recently merged revert to the COCO implementation #1327 and setting the torch version as an internal option? 🐰
Once again thank you for all your effort! 💜
What does this PR do?
Fixes #1184
Adds a fix for incorrect mean AP when there are no GT bboxes for a class but there are predicted boxes for it, i.e. every prediction for that class is a false positive.
Solution: make the precision for that class 0.
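For intuition, here is a minimal toy sketch of what the proposed fix changes. This is not the torchmetrics code; the class names and AP values are made up. A class with predicted boxes but no GT boxes is counted in the average with AP 0 instead of being left out, so its false positives pull the mAP down.

```python
# Toy illustration only, not the torchmetrics implementation.
# "dog" has predicted boxes but no ground-truth boxes; its AP is marked None.
# All class names and numbers are made up.
per_class_ap = {"cat": 0.8, "car": 0.6, "dog": None}

# Leaving the class out of the average (one way the mAP ends up too high):
kept = [ap for ap in per_class_ap.values() if ap is not None]
print(sum(kept) / len(kept))  # 0.70

# Counting it with precision/AP 0, as this PR proposes:
fixed = [0.0 if ap is None else ap for ap in per_class_ap.values()]
print(sum(fixed) / len(fixed))  # ~0.47, false positives for "dog" are penalised
```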
Note: I have yet to write tests. I tried running them at `tests/unittests/detection/test_map.py` but faced a `FileNotFoundError` exception because of the following missing file:

`_SAMPLE_DETECTION_SEGMENTATION = os.path.join(_PATH_ROOT, "_data", "detection", "instance_segmentation_inputs.json")`
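For reference, here is a rough sketch of the kind of regression test @Borda asked for, assuming the usual `MeanAveragePrecision` update/compute API. The box coordinates, scores, and labels are placeholders I made up, and a hard-coded expected value would still need to be verified against a trusted reference (e.g. pycocotools) before committing; the assert below only captures the direction of the fix.

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision


def test_class_with_preds_but_no_gt_penalises_map():
    """A class that has predictions but no GT boxes should drag the mAP down.

    All coordinates, scores and labels are illustrative placeholders.
    """
    metric = MeanAveragePrecision()

    preds = [
        {
            # One correct prediction for class 0 and one spurious prediction
            # for class 1, which has no ground-truth boxes at all.
            "boxes": torch.tensor([[10.0, 10.0, 50.0, 50.0], [60.0, 60.0, 90.0, 90.0]]),
            "scores": torch.tensor([0.9, 0.8]),
            "labels": torch.tensor([0, 1]),
        }
    ]
    target = [
        {
            "boxes": torch.tensor([[10.0, 10.0, 50.0, 50.0]]),
            "labels": torch.tensor([0]),
        }
    ]

    metric.update(preds, target)
    result = metric.compute()

    # With the fix, class 1 contributes an AP of 0 to the average, so the
    # overall mAP must be strictly below the perfect score on class 0 alone.
    assert result["map"] < 1.0
```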
Please check the code in the issue and compare the before vs. after outputs below.
Before
After
Other changes:
- Use gtignore to early-return from the function. This will save computation (a rough sketch of the idea follows below).
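A hypothetical sketch of the early-return idea, since the wording above is terse. The function name, signature, and variable names are illustrative and do not mirror the actual torchmetrics internals; the point is only that when every GT box is flagged as ignored there is nothing to match, so the expensive IoU/matching step can be skipped.

```python
import torch


def _evaluate_image_sketch(det_boxes: torch.Tensor, gt_boxes: torch.Tensor, gt_ignore: torch.Tensor):
    """Illustrative only; not the real torchmetrics function."""
    # Early return: with no usable ground truth there is nothing to match
    # against, so skip the comparatively expensive IoU computation and matching.
    if gt_boxes.numel() == 0 or bool(gt_ignore.all()):
        return None
    # ...otherwise compute the IoU matrix and run the greedy matching as before.
    ...
```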
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃