
Error in accumulate: math_matches error: Assertion error in line 157 #19

Closed
patrontheo opened this issue Mar 1, 2024 · 4 comments
Labels: bug

@patrontheo

Describe the bug
cocoEval.accumulate() is not working. It returns this error:

 math_matches error: 
Traceback (most recent call last):
  File "/home/xxx/mambaforge/envs/fiftyone/lib/python3.10/site-packages/faster_coco_eval/core/faster_eval_api.py", line 157, in accumulate
    assert self.detection_matches.shape[1] == len(self.cocoDt.anns)
AssertionError

To Reproduce

from faster_coco_eval import COCO, COCOeval_faster
coco_gt = COCO('gt.json')
coco_dt = coco_gt.loadRes('preds.json')

cocoEval = COCOeval_faster(coco_gt, coco_dt, iouType='bbox')
cocoEval.evaluate()
cocoEval.accumulate()

Expected behavior
This call should complete without errors; it works correctly with the pycocotools implementation.

Additional context
Could this happen because an image contains no ground truths? Just an idea I'm throwing out; I'm not sure it's related.

Files
gt.json
preds.json
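
A quick way to test that hypothesis (a minimal sketch, assuming faster_coco_eval's COCO exposes the same getImgIds/getAnnIds API as pycocotools; the variable names are illustrative):

from faster_coco_eval import COCO

# Load the ground truth and list image ids that have no annotations at all.
coco_gt = COCO('gt.json')
images_without_gt = [
    img_id for img_id in coco_gt.getImgIds()
    if len(coco_gt.getAnnIds(imgIds=img_id)) == 0
]
print(f"{len(images_without_gt)} images have no ground-truth boxes")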

@MiXaiLL76 (Owner)

Thanks for the issue!
This error is caused by incorrectly specified criteria for the extra_calc operation.

I'll fix it within an hour and post a new version!

MiXaiLL76 mentioned this issue in a pull request (merged) on Mar 4, 2024
@MiXaiLL76 (Owner) commented Mar 4, 2024

FIX ff6e336

Without extra_calc:

from faster_coco_eval import COCO, COCOeval_faster
coco_gt = COCO('gt.json')
coco_dt = coco_gt.loadRes('preds.json')

cocoEval = COCOeval_faster(coco_gt, coco_dt, iouType='bbox')
cocoEval.evaluate()
cocoEval.accumulate()
cocoEval.summarize()

print(cocoEval.stats_as_dict)

{'AP_all': 0.844909645287565, 'AP_50': 0.9574501792007959, 'AP_75': 0.9422624028756038, 'AP_small': 0.6037607333485913, 'AP_medium': 0.8790270967302074, 'AP_large': 0.9185868759796408, 'AR_1': 0.17954286408399003, 'AR_10': 0.738941830046937, 'AR_100': 0.9033280920110436, 'AR_small': 0.7773887684368035, 'AR_medium': 0.9255266295014362, 'AR_large': 0.9589629629629629, 'AR_50': 0.9933915046286044, 'AR_75': 0.9782595186036742}

With extra_calc:

from faster_coco_eval import COCO, COCOeval_faster
coco_gt = COCO('gt.json')
coco_dt = coco_gt.loadRes('preds.json')

cocoEval = COCOeval_faster(coco_gt, coco_dt, iouType='bbox', extra_calc=True)
cocoEval.evaluate()
cocoEval.accumulate()
cocoEval.summarize()

print(cocoEval.stats_as_dict)

{'AP_all': 0.844909645287565, 'AP_50': 0.9574501792007959, 'AP_75': 0.9422624028756038, 'AP_small': 0.6037607333485913, 'AP_medium': 0.8790270967302074, 'AP_large': 0.9185868759796408, 'AR_1': 0.17954286408399003, 'AR_10': 0.738941830046937, 'AR_100': 0.9033280920110436, 'AR_small': 0.7773887684368035, 'AR_medium': 0.9255266295014362, 'AR_large': 0.9589629629629629, 'AR_50': 0.9933915046286044, 'AR_75': 0.9782595186036742, 'mIoU': 0.9211815696510985, 'mAUC_50': 0.99020986735309}
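
Note that with extra_calc=True the stats dictionary gains the additional mIoU and mAUC_50 keys, while the standard AP/AR values are identical to the run without extra_calc.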

Please reply after checking the results, and I will close the issue.

@MiXaiLL76 (Owner)

The new version of the library is already on PyPI.

MiXaiLL76 added the bug label on Mar 5, 2024
@patrontheo (Author)

Thanks for the fix!
It also works on my side, and the computed metrics seem to match the ones from pycocotools :).
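
For reference, a minimal cross-check against pycocotools on the same files (a sketch, assuming pycocotools is installed; the variable names are illustrative):

from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Run the reference pycocotools evaluation on the same annotation/prediction files.
coco_gt = COCO('gt.json')
coco_dt = coco_gt.loadRes('preds.json')
coco_eval = COCOeval(coco_gt, coco_dt, iouType='bbox')
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()

# coco_eval.stats is a 12-element array in the order:
# AP_all, AP_50, AP_75, AP_small, AP_medium, AP_large,
# AR_1, AR_10, AR_100, AR_small, AR_medium, AR_large.
print(coco_eval.stats)  # compare with the stats_as_dict values printed above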
