Add confusion matrix as metric for semantic segmentation #788

Merged: 3 commits merged into master from lf/eval-metric on May 29, 2019

Conversation

@lewfish (Contributor) commented May 28, 2019

This adds a conf_mat field to the eval metrics generated for semantic segmentation. For each class item, conf_mat is the row in the confusion matrix representing that class. For the average item, conf_mat is the entire matrix.
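
A minimal sketch (not the actual Raster Vision implementation) of how the per-class rows relate to the full confusion matrix; the class names, label arrays, and dict layout here are hypothetical and only illustrate the idea:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Hypothetical classes and per-pixel labels, for illustration only.
class_names = ['background', 'building', 'road']
gt = np.array([0, 0, 1, 1, 2, 2, 1, 0])    # ground-truth class ids
pred = np.array([0, 1, 1, 1, 2, 0, 2, 0])  # predicted class ids

# Full confusion matrix: rows are ground truth, columns are predictions.
full_mat = confusion_matrix(gt, pred, labels=list(range(len(class_names))))

eval_items = []
for i, name in enumerate(class_names):
    # Each per-class item carries its row of the confusion matrix.
    eval_items.append({'class_name': name, 'conf_mat': full_mat[i].tolist()})

# The average item carries the entire matrix.
eval_items.append({'class_name': 'average', 'conf_mat': full_mat.tolist()})

print(eval_items)
```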

@lewfish lewfish force-pushed the lf/eval-metric branch from 6f838ce to 73613af May 29, 2019
@lewfish lewfish force-pushed the lf/eval-metric branch from 73613af to 99a9aae May 29, 2019
@lewfish lewfish merged commit 5dfa492 into master May 29, 2019
1 check passed
continuous-integration/travis-ci/pr The Travis CI build passed
@lewfish lewfish deleted the lf/eval-metric branch May 29, 2019