MeanAveragePrecision - bug in max_detection_thresholds #2560

Closed
nisyad-ms opened this issue May 23, 2024 · 3 comments
Labels: bug / fix (Something isn't working), help wanted (Extra attention is needed), v1.2.x

Comments

nisyad-ms commented May 23, 2024

🐛 Bug

MeanAveragePrecision returns map=-1 for any max_detection_thresholds setting that does not include the value 100.

To Reproduce

from torch import tensor
from torchmetrics.detection import MeanAveragePrecision

preds = [
    dict(
        boxes=tensor([[0, 0, 100, 100], [0, 0, 50, 50]]),
        scores=tensor([1.0, 0.9]),
        labels=tensor([0, 1]),
    )
]
target = [
    dict(
        boxes=tensor([[0, 0, 100, 100], [0, 0, 50, 50]]),
        labels=tensor([0, 1]),
    )
]

metric = MeanAveragePrecision(iou_type="bbox", max_detection_thresholds=[1, 10, 50])
metric.update(preds, target)
result = metric.compute()
print(result)  # map = tensor(-1.)

Expected behavior

Expected: map=1 (for 50 max detections)

Environment

  • TorchMetrics version (and how you installed TM, e.g. conda, pip, build from source): 1.2.1
  • Python & PyTorch Version (e.g., 1.0): 3.10
  • Any other relevant information such as OS (e.g., Linux): Ubuntu 22.04
@nisyad-ms added the bug / fix and help wanted labels on May 23, 2024
@Borda added the v1.2.x label on May 24, 2024
@SkafteNicki (Member) commented:

Hi @nisyad-ms, thanks for reporting this issue.
Sadly, this is due to a known bug in the official pycocotools backend, which we use for the computations. Specifically, this line:
https://github.com/cocodataset/cocoapi/blob/8c9bcc3cf640524c4c20a9c40e89cb6a2f2fa0e9/PythonAPI/pycocotools/cocoeval.py#L460
should have been

stats[0] = _summarize(1, maxDets=self.params.maxDets[2])

for it to work. Sadly, the repo is not really maintained anymore, but it is still considered the official reference for mAP.
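
For intuition, this is why map comes back as -1: the summary step selects results by matching a hard-coded maxDets=100 default against params.maxDets, and when 100 is not among the configured thresholds the selection is empty, which pycocotools reports as -1. A minimal, self-contained sketch of that matching logic (paraphrased, not the actual pycocotools code):

# Sketch of the selection step in _summarize (paraphrased, not the real code).
# With max_detection_thresholds=[1, 10, 50] the hard-coded default of 100
# matches nothing, so no results are selected and the metric is reported as -1.
configured_max_dets = [1, 10, 50]   # what the user passed to MeanAveragePrecision
requested_max_dets = 100            # hard-coded default in _summarize
selected = [i for i, m in enumerate(configured_max_dets) if m == requested_max_dets]
print(selected)  # [] -> empty selection -> mAP reported as -1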

Instead, you can install the faster-coco-eval backend (https://github.com/MiXaiLL76/faster_coco_eval), which we also support. This backend has the fix implemented, so your code calculates the correct value:

metric = MeanAveragePrecision(iou_type="bbox", max_detection_thresholds=[1, 10, 50], backend="faster_coco_eval")
metric.update(preds, target)
result = metric.compute()
print(result)

#{'map': tensor(1.), ...

Closing issue because we really cannot fix this on our side.

@nisyad-ms (Author) commented:

Thanks @SkafteNicki for the information. How do I ensure the faster-coco-eval backend is used? Will just installing it do? Thanks again.

@SkafteNicki (Member) commented:

> Thanks @SkafteNicki for the information. How do I ensure the faster-coco-eval backend is used? Will just installing it do? Thanks again.

Sorry, I should have specified that. You install the backend with pip install faster-coco-eval, and then, when initializing the MeanAveragePrecision class, you need to set the backend argument to "faster_coco_eval".
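
Putting the two steps together, a minimal sketch (reusing the thresholds from the reproduction above) would be:

# Assumes faster-coco-eval has already been installed:
#   pip install faster-coco-eval
from torchmetrics.detection import MeanAveragePrecision

metric = MeanAveragePrecision(
    iou_type="bbox",
    max_detection_thresholds=[1, 10, 50],
    backend="faster_coco_eval",  # explicitly select the alternative backend
)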
