Functional Confusion Matrix with Multi-Label #100

Closed
SpontaneousDuck opened this issue Mar 16, 2021 · 5 comments · Fixed by #134
@SpontaneousDuck

🐛 Bug

I am trying to analyze a model that makes multi-label predictions. When creating a confusion matrix with the functional confusion_matrix method, I get a very different result than I expected. I may be misunderstanding how this is supposed to work, so any help would be appreciated!

To Reproduce

Steps to reproduce the behavior:

  1. Produce multi-label predictions of shape (N, C) with torch.sigmoid applied to the output, along with ground-truth data of matching shape.
  2. Call the functional confusion_matrix method on the data.

Code sample

>>> from torchmetrics.functional import confusion_matrix
>>> import torch
>>> x = torch.tensor([[.4,.5,.6,.7],[.3,.4,.7,.1]])
>>> y = torch.tensor([[0,0,0,1],[0,1,0,0]], dtype=torch.int32)
>>> confusion_matrix(x, y, num_classes=4, normalize='none')
tensor([[3., 3., 0., 0.],
        [1., 1., 0., 0.],
        [0., 0., 0., 0.],
        [0., 0., 0., 0.]])

Expected behavior

I would expect the confusion matrix to count the classes that were predicted for each true class, but I may be wrong:

tensor([[0, 0, 0, 0],
        [0, 0, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 1]])

Environment

  • PyTorch Version (e.g., 1.0): 1.7
  • OS (e.g., Linux): Linux
  • How you installed PyTorch (conda, pip, source): conda
  • Python version: 3.8.8
  • CUDA/cuDNN version: 11.03
  • GPU models and configuration: Nvidia Tesla V100

Thanks for the great project and help!!

@SpontaneousDuck added the help wanted label on Mar 16, 2021
@github-actions

Hi! Thanks for your contribution, great first issue!

@SpontaneousDuck
Author

So I guess the more useful expected behavior would be the multi-label confusion matrix output of sklearn.metrics.multilabel_confusion_matrix, where the output has shape (n_outputs, 2, 2). It appears that torchmetrics is sort of doing this, but it extends the 2x2 matrix to the size of num_classes, so the multi-label confusion matrix occupies the first two rows and columns. torchmetrics then also sums along the first axis and returns only the overall matrix instead of class-wise matrices.
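
For reference, a minimal sketch of the sklearn behavior described above, reusing the preds/target from the original code sample; the 0.5 decision threshold is an assumption here, not taken from the issue:

>>> import numpy as np
>>> from sklearn.metrics import multilabel_confusion_matrix
>>> y_true = np.array([[0, 0, 0, 1], [0, 1, 0, 0]])
>>> # binarize the sigmoid outputs at an assumed 0.5 threshold
>>> y_pred = (np.array([[.4, .5, .6, .7], [.3, .4, .7, .1]]) >= 0.5).astype(int)
>>> mcm = multilabel_confusion_matrix(y_true, y_pred)
>>> mcm.shape  # one binary [[TN, FP], [FN, TP]] matrix per label
(4, 2, 2)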

@FlorianMF

I'm in the same situation as you.
The docstring of confusion_matrix says that it works for multi-label targets.
In practice it calculates the confusion matrix in a binary way: using num_classes > 2 or num_classes=2 gives the same result, except for the additional all-zero rows and columns for the extra classes.

I don't expect the same output as you. In my opinion, the output should be a list/Tensor of binary confusion matrices of length num_classes, as in this sklearn example.

Personally, I iterate over the predictions and targets, as _auroc_compute does:

[confusion_matrix(class_preds[:, idx], y_true[:, idx], num_classes=2)
 for idx in range(class_preds.shape[-1])]
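
If a single tensor is preferred, the per-label matrices can be stacked to match sklearn's (n_labels, 2, 2) shape; a small sketch, reusing the names from the snippet above:

import torch
# one 2x2 binary confusion matrix per label, stacked into (n_labels, 2, 2)
cms = torch.stack([confusion_matrix(class_preds[:, idx], y_true[:, idx], num_classes=2)
                   for idx in range(class_preds.shape[-1])])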

@SpontaneousDuck
Author

Thanks for the response! Yeah, I looked into sklearn and understand what I should be getting now. I am using your suggested workaround in the meantime. confusion_matrix should match sklearn's output if it were working correctly, right?

@FlorianMF

IMO yes, if the philosophy is to create a PyTorch-optimised sklearn.

@Borda added the bug / fix label on Mar 17, 2021
@Borda added this to the 0.3 milestone on Mar 25, 2021
@SkafteNicki self-assigned this on Mar 26, 2021