
MAP handling an empty prediction array. #601

Closed
SBoulanger opened this issue Nov 3, 2021 · 4 comments · Fixed by #624
Labels
bug / fix (Something isn't working) · help wanted (Extra attention is needed)

Comments

SBoulanger commented Nov 3, 2021

🐛 Bug

When there are no predictions in an update, the MAP class raises a ValueError. In the case where no predictions are made but targets exist, the false-negative count should rise and worsen Recall, yet the MAP class does not seem to handle this case.
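
For concreteness, here is the recall arithmetic this expectation rests on, with illustrative numbers (not from the issue):

# recall = TP / (TP + FN); with zero predictions, every ground-truth box is a false negative
tp, fn = 0, 3            # e.g. three ground-truth boxes, none predicted
recall = tp / (tp + fn)  # 0.0 -- a worse score, not an error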

To Reproduce

Run the code sample in metrics/detection_map.py but empty the tensors in the preds dictionary.

Code sample

import torch

preds = [
    dict(
        # The boxes keyword should contain an [N, 4] tensor,
        # where N is the number of detected boxes, each in the format
        # [xmin, ymin, xmax, ymax] in absolute image coordinates
        boxes=torch.Tensor([]),
        # The scores keyword should contain an [N,] tensor where
        # each element is a confidence score between 0 and 1
        scores=torch.Tensor([]),
        # The labels keyword should contain an [N,] tensor
        # with integers of the predicted classes
        labels=torch.IntTensor([]),
    )
]
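
For a fully runnable reproduction, a sketch along these lines should trigger the error. It assumes the MAP class and import path from torchmetrics around v0.6 (when this issue was filed), and the target boxes and labels are illustrative:

from torchmetrics.detection.map import MAP  # import path at the time of this issue

# A non-empty target, so the missed detection should count as a false negative
target = [
    dict(
        boxes=torch.Tensor([[10.0, 20.0, 30.0, 40.0]]),
        labels=torch.IntTensor([0]),
    )
]

metric = MAP()
metric.update(preds, target)  # raises ValueError with the empty preds above
print(metric.compute())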

Expected behavior

Recall should be affected, not a ValueError raised.

Environment

  • PyTorch Version (e.g., 1.0):
  • OS (e.g., Linux):
  • How you installed PyTorch (conda, pip, source):
  • Build command you used (if compiling from source):
  • Python version:
  • CUDA/cuDNN version:
  • GPU models and configuration:
  • Any other relevant information:

Additional context

I may be missing something non-obvious in how Recall is calculated; I just want to make sure the metrics are correct.

SBoulanger added the bug / fix and help wanted labels Nov 3, 2021
github-actions bot commented Nov 3, 2021

Hi! Thanks for your contribution! Great first issue!

Borda (Member) commented Nov 3, 2021

I believe it was solved by #594; anyway, feel free to reopen if the issue holds 🐰

@Borda Borda closed this as completed Nov 3, 2021
SBoulanger (Author) commented

Hey @Borda, I see the change now, but I am still running into the issue when my tensor is torch.Tensor([]). The fix seems to address the case when a tensor is torch.Tensor([[]]). Should it handle both cases?

tkupek (Contributor) commented Nov 15, 2021

@SBoulanger thanks, I will take care of this fix.
Until then, I recommend using torch.Tensor([[]]).
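
For reference, the two constructions the thread distinguishes differ in shape; a quick check with plain PyTorch:

import torch

print(torch.Tensor([]).shape)    # torch.Size([0])    -- 1-D empty, the failing case
print(torch.Tensor([[]]).shape)  # torch.Size([1, 0]) -- 2-D empty, the suggested workaround
print(torch.zeros(0, 4).shape)   # torch.Size([0, 4]) -- an explicitly shaped empty boxes tensor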
