
Make it possible to exclude "null" class labels from the computation of metrics #1977

Closed · Tracked by #2084

chrisjkuch opened this issue Nov 1, 2023 · 3 comments · Fixed by #2088

Comments

@chrisjkuch

🚀 Feature

Make it possible to exclude "null" class labels from the computation of metrics.

Motivation

Given the functionality to create a "null" class in the class config and to exclude it from the loss using ignore_class_index in the solver config, I would have expected the metrics computed during evaluation to exclude the null class as well, or at least for the solver/learner config (or the call to train) to offer a keyword argument that lets me explicitly ignore null labels, treating those pixels as unlabelled for the purposes of evaluation. Currently, precision is computed as if null-labelled pixels were valid labels, so precision (and therefore the F1 score) comes out much lower than one would expect from evaluating only the non-null-labelled pixels.
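
To make the effect concrete, here is a small hypothetical example (the numbers and the choice of class 0 as the null class are made up for illustration):

    import numpy as np

    # Hypothetical confusion matrix: rows = ground truth, cols = prediction.
    # Class 0 is the "null" class; classes 1 and 2 are real classes.
    conf_mat = np.array([
        [0, 50, 50],   # null-labelled pixels that the model assigned to 1/2
        [0, 90, 10],   # class 1
        [0, 10, 90],   # class 2
    ])

    # Precision for class 1 with null pixels counted as valid labels:
    # 90 / (50 + 90 + 10) = 0.60
    prec_with_null = conf_mat[1, 1] / conf_mat[:, 1].sum()

    # Precision for class 1 after dropping the null row and column:
    # 90 / (90 + 10) = 0.90, which matches intuition
    reduced = np.delete(np.delete(conf_mat, 0, axis=0), 0, axis=1)
    prec_without_null = reduced[0, 0] / reduced[:, 0].sum()

    print(prec_with_null, prec_without_null)  # 0.6 0.9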

Pitch

Excluding null labels from model evaluation is a nice way to ensure that the metric values you see match what you'd intuitively expect, rather than having to correct them after the fact. This shouldn't be too difficult to implement.

I'd propose adding an option to ignore null class labels to the solver config, or making it the default behavior to ignore null class labels during metrics evaluation whenever ignore_class_index is specified, as sketched below.
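
As a hypothetical sketch of what that option could look like (ignore_class_index is the existing field mentioned above; ignore_class_in_metrics is purely illustrative, not an existing API):

    solver = SolverConfig(
        lr=1e-4,
        ignore_class_index=null_class_idx,   # existing: class excluded from the loss
        ignore_class_in_metrics=True,        # proposed: also exclude it from metrics
    )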

Alternatives

Specifying an external loss definition that computes these metrics in the manner suggested.

Additional context

@chrisjkuch changed the title from "Make it possible to exclude "null" class labels from the computation of metrics." to "Make it possible to exclude "null" class labels from the computation of metrics" on Nov 1, 2023
@AdeelH (Collaborator) commented Nov 3, 2023

Thanks for the suggestion. This makes sense to me. I suppose we can just drop the row and column of the ignored class from the confusion matrix before computing the metrics.

    conf_mat = sum([o['conf_mat'] for o in outputs])
    conf_mat_metrics = compute_conf_mat_metrics(conf_mat,
                                                self.cfg.data.class_names)
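
A minimal sketch of that idea, assuming the confusion matrix is a NumPy array and that the null class index is available (null_class_idx and the drop_class helper are illustrative, not existing API):

    import numpy as np

    def drop_class(conf_mat, idx):
        """Drop the row and column of class `idx` from a square confusion matrix."""
        return np.delete(np.delete(conf_mat, idx, axis=0), idx, axis=1)

    conf_mat = sum([o['conf_mat'] for o in outputs])
    class_names = self.cfg.data.class_names
    if null_class_idx is not None:
        conf_mat = drop_class(conf_mat, null_class_idx)
        class_names = [n for i, n in enumerate(class_names)
                       if i != null_class_idx]
    conf_mat_metrics = compute_conf_mat_metrics(conf_mat, class_names)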

@AdeelH (Collaborator) commented Nov 3, 2023

It might take me some days to get around to this though.

@chrisjkuch (Author)

No rush and no expectations; I have a hacky workaround for now that works as you suggest. I'm happy to submit a PR, but that might similarly take some time.
