
ZeroDivisionError: division by zero #22
Closed · WenFuLee opened this issue Apr 18, 2019 · 7 comments

@WenFuLee commented Apr 18, 2019

When running an evaluation on the test set, I got the following error. I don't have any clue what's causing it.

2019-04-18 03:56:17,288 | upsnet_end2end_test.py | line 307: unified pano result:
Traceback (most recent call last):
  File "upsnet/upsnet_end2end_test.py", line 316, in <module>
    upsnet_test()
  File "upsnet/upsnet_end2end_test.py", line 308, in upsnet_test
    test_dataset.evaluate_panoptic(test_dataset.get_unified_pan_result(all_ssegs, all_panos, all_pano_cls_inds, stuff_area_limit=config.test.panoptic_stuff_area_limit), os.path.join(final_output_path, 'results', 'pans_unified'))
  File "upsnet/../upsnet/dataset/base_dataset.py", line 333, in evaluate_panoptic
    results = pq_compute(gt_json, pred_json, gt_pans, pred_pans, categories)
  File "upsnet/../upsnet/dataset/base_dataset.py", line 301, in pq_compute
    results[name], per_class_results = pq_stat.pq_average(categories, isthing=isthing)
  File "upsnet/../upsnet/dataset/base_dataset.py", line 97, in pq_average
    return {'pq': pq / n, 'sq': sq / n, 'rq': rq / n, 'n': n}, per_class_results
ZeroDivisionError: division by zero

But the panoptic segmentation results can be successfully generated, as shown below.
[attached image: lindau_000000_000019]

So how is it possible that I run into the case n = 0? Any ideas would be appreciated. Thanks.
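For context, the division happens in the per-class averaging step. Below is a minimal sketch of that logic, modeled on panopticapi's PQStat.pq_average, which base_dataset.py appears to follow given the traceback; the field names are illustrative assumptions. The key point: n counts only categories that have at least one true positive, false positive, or false negative, so if ground truth and predictions never match under any category, n stays 0 and the final division fails.

def pq_average(pq_per_cat, categories, isthing=None):
    # pq_per_cat: per-category accumulators with .iou, .tp, .fp, .fn
    # (illustrative names, mirroring panopticapi's PQStatCat).
    pq, sq, rq, n = 0.0, 0.0, 0.0, 0
    for label, info in categories.items():
        # Restrict to "thing" or "stuff" categories when requested.
        if isthing is not None and (info['isthing'] == 1) != isthing:
            continue
        s = pq_per_cat[label]
        # A category with no TP, FP, or FN never contributes to n.
        if s.tp + s.fp + s.fn == 0:
            continue
        n += 1
        denom = s.tp + 0.5 * s.fp + 0.5 * s.fn
        pq += s.iou / denom
        sq += s.iou / s.tp if s.tp != 0 else 0.0
        rq += s.tp / denom
    # If nothing ever matched (e.g. GT and prediction segment ids never
    # line up), n is still 0 here and this line raises ZeroDivisionError.
    return {'pq': pq / n, 'sq': sq / n, 'rq': rq / n, 'n': n}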

@YuwenXiong (Contributor)

Please check #11 (comment).

@WenFuLee (Author)

I followed #11 (comment), but the solutions in the comments didn't work out for me.

@WenFuLee (Author)

This issue was solved by following #11.

@discretecoder

I am also having this error with my own dataset. How can I resolve the issue?

@rlangefe

I ran into this issue too when trying to use my own dataset. Did anyone find a solution to this?

@RYYAI commented Apr 14, 2021

Same problem here.

@rlangefe

We fixed this just yesterday on our dataset. The issue stemmed from how we were constructing the labels and annotation PNG files. We needed to make sure that the labels matched and each instance had a unique id, and that the ids were encoded correctly using the formula from the COCO panoptic format: id = R + G * 256 + B * 256^2. Once we fixed this, we actually get true positives in the confusion matrix, so we don't have issues like this anymore. Hope this is clear and helps. Feel free to ask more questions if this doesn't make sense. @RYYAI
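To make that encoding concrete, here is a minimal sketch of the conversion in both directions. It mirrors the id2rgb/rgb2id helpers in COCO's panopticapi; the function names and shapes here are illustrative.

import numpy as np

def id_to_rgb(id_map):
    # Encode a 2-D array of segment ids into an RGB annotation image
    # using the COCO panoptic convention: id = R + G * 256 + B * 256**2.
    rgb = np.zeros(id_map.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = id_map % 256            # R: low byte
    rgb[..., 1] = (id_map // 256) % 256   # G: middle byte
    rgb[..., 2] = id_map // (256 ** 2)    # B: high byte
    return rgb

def rgb_to_id(rgb):
    # Inverse: recover segment ids from an RGB annotation image.
    rgb = rgb.astype(np.uint32)  # avoid uint8 overflow in the sum
    return rgb[..., 0] + rgb[..., 1] * 256 + rgb[..., 2] * (256 ** 2)

Writing the annotation PNGs with exactly this mapping, with a unique id per instance, is what lets the evaluator match GT and predicted segments, so n is no longer 0.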
