
Confusion matrix weird results #2964

Closed
constantinfite opened this issue Apr 28, 2021 · 5 comments
Labels: question (Further information is requested), Stale

Comments

@constantinfite

❔Question

Hi, I removed the line `array = self.matrix / (self.matrix.sum(0).reshape(1, self.nc + 1) + 1E-6)  # normalize` at line 164 in the file metrics.py to obtain the real numbers of true positives, false negatives, ...
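
For reference, here is a minimal sketch of what that normalization line does (toy numbers only, not the actual ConfusionMatrix class from metrics.py): it divides each column of the raw count matrix by its column sum, so the plotted matrix shows per-class proportions rather than counts.

```python
import numpy as np

# Toy example with nc classes plus one extra row/column for "background";
# the real YOLOv5 ConfusionMatrix differs in detail.
nc = 1                                   # one class: shark
matrix = np.zeros((nc + 1, nc + 1))
matrix[0, 0] = 551                       # sharks detected correctly (TP)
matrix[1, 0] = 62                        # sharks missed -> background row (FN)

# The removed line: divide each column by its sum (plus 1E-6 to avoid
# division by zero), turning raw counts into per-class fractions.
normalized = matrix / (matrix.sum(0).reshape(1, nc + 1) + 1E-6)

print(matrix[:, 0])      # raw counts -> [551.  62.]
print(normalized[:, 0])  # fractions  -> [0.899...  0.101...]
```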

When I don't set a confidence score threshold, it gives me this confusion matrix:

[confusion matrix image]

These results match my data: I have 613 objects (sharks) to detect, of which 551 are detected (TP) and 62 are missed (FN).

Now, when I set the confidence score threshold to 0.7 with --conf 0.7, it gives me this matrix:

[confusion matrix image]

What I don't understand is that when I sum the TP and FN for the shark class this time, I get 448 + 39 = 487, not 613.

Do you have any idea why it returns this result?
Thanks!

constantinfite added the question label on Apr 28, 2021
@xinxin342

@constantinfite That's interesting, but it doesn't make much sense.
Did you check it again?

@constantinfite
Author

@xinxin342 Yes, I checked again and got the same result...

@github-actions
Contributor

github-actions bot commented Jun 4, 2021

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@zxcamazing

I have the same confusion. What a confusion matrix!

@glenn-jocher
Member

@zxcamazing I understand the confusion here. It seems like you're experiencing unexpected results with the generated confusion matrix. The change you made to the metrics.py file may have altered the calculation behavior and led to the discrepancy in the results you're observing.

When considering a confidence score threshold, it's important to note that this would impact the number of true positives, false positives, etc., as detections below the set threshold would not be accounted for.
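
To make that concrete, here is a toy sketch of the filtering step (assuming the usual (x1, y1, x2, y2, conf, cls) detection layout; the exact ConfusionMatrix.process_batch code may differ): detections below the threshold never reach the matrix at all, which is why the cell counts change between your two runs.

```python
import numpy as np

# Toy detections in (x1, y1, x2, y2, conf, cls) format; illustrative only.
detections = np.array([
    [10., 10., 50., 50., 0.95, 0.],  # kept at --conf 0.7
    [60., 60., 90., 90., 0.55, 0.],  # dropped at --conf 0.7, so its ground-truth
])                                   # shark is counted against background instead

conf_thres = 0.7
kept = detections[detections[:, 4] > conf_thres]
print(f"{len(detections)} detections before filtering, {len(kept)} after")  # 2 ... 1
```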

To fully understand the issue and provide the best guidance, I'd recommend taking a step back and reconsidering the modifications made to the metrics.py file. Double-checking the implementation might reveal why the results differ when a confidence score threshold is set.

In case you need further assistance, our comprehensive documentation at https://docs.ultralytics.com/yolov5/ can be helpful for reference.

Let me know how it goes!
