AUROC reported wrong value when called differently #1989

Open
SinChee opened this issue Aug 9, 2023 · 1 comment · May be fixed by #1676
Labels: bug / fix (Something isn't working), help wanted (Extra attention is needed), v1.0.x

Comments

SinChee commented Aug 9, 2023

🐛 Bug

The computed AUROC differs depending on whether the metric is called directly (forward) or via per-sample update() followed by compute().

To Reproduce

Copy the following code and run it.

Note that when .update() is called, values greater than 1.0 are rescaled with the sigmoid activation, while values less than 1.0 are left untouched. This results in the compute step running on inconsistently scaled inputs. However, calling the metric directly (forward) does not rescale the values. (A minimal sketch of this behavior follows the reproduction script below.)

from torch import tensor
from torchmetrics import AUROC

# Predictions deliberately mix values inside and outside the [0, 1] range.
pp = tensor([[1.2], [0.9678], [1.7]])
tt = tensor([[1], [0], [1]])

auroc = AUROC(num_classes=1, task="binary")

# Case 1: a single forward call over the whole batch.
print("Forward:", auroc(pp, tt))

# Case 2: one update() per sample, then a single compute().
for p, t in zip(pp, tt):
    auroc.update(p, t)
    print(auroc.preds, auroc.target)

print("Update and Compute:", auroc.compute())

Expected behavior

The reported AUROC metrics for both cases should be the same.

Environment

  • TorchMetrics 1.0.3
  • Python 3.9.16
  • PyTorch 1.13.1
@SinChee added the bug / fix (Something isn't working) and help wanted (Extra attention is needed) labels on Aug 9, 2023

github-actions bot commented Aug 9, 2023

Hi! thanks for your contribution!, great first issue!

@SkafteNicki linked a pull request (#1676) on Aug 9, 2023 that will close this issue
@Borda added the v1.0.x label on Aug 25, 2023
3 participants: @Borda, @SinChee, and others