
Micro average for auroc #104

Closed
FlorianMF opened this issue Mar 17, 2021 · 0 comments · Fixed by #110
Assignees: SkafteNicki
Labels: enhancement, help wanted

Comments


FlorianMF commented Mar 17, 2021

🚀 Feature

Add 'micro' to list of allowed averages.

Motivation

Currently, 'average' must be one of [None, 'macro', 'weighted'].
For multi-label classification it would be nice to also allow micro averaging.
This comes down to calculating auroc(preds.flatten(), target.flatten()).

This would be similar to https://scikit-learn.org/stable/modules/generated/sklearn.metrics.roc_auc_score.html.
One could also consider adding 'samples' to the list of accepted averages.

What do you think?
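To make the proposal concrete, here is a minimal pure-Python sketch (the `binary_auroc` helper below is hypothetical, not an existing torchmetrics or sklearn API): 'micro' averaging flattens the multi-label predictions and targets and computes a single binary AUROC over the pooled pairs, whereas 'macro' scores each label separately and averages.

```python
def binary_auroc(preds, target):
    """Binary AUROC via the Mann-Whitney U statistic (assumes no tied scores)."""
    order = sorted(preds)
    rank = {p: i + 1 for i, p in enumerate(order)}   # ascending 1-based ranks
    pos = [p for p, t in zip(preds, target) if t == 1]
    n_pos, n_neg = len(pos), len(preds) - len(pos)
    rank_sum = sum(rank[p] for p in pos)
    return (rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Two-label toy example: preds[i][j] is the score for label j on sample i.
preds  = [[0.9, 0.2], [0.3, 0.8], [0.6, 0.1], [0.5, 0.45]]
target = [[1,   0  ], [0,   1  ], [1,   0  ], [0,   1  ]]

# 'micro': flatten everything, then compute one binary AUROC.
flat_preds  = [p for row in preds for p in row]
flat_target = [t for row in target for t in row]
micro = binary_auroc(flat_preds, flat_target)

# 'macro': one AUROC per label, then the unweighted mean.
macro = sum(
    binary_auroc([row[j] for row in preds], [row[j] for row in target])
    for j in range(2)
) / 2

print(micro, macro)  # 0.9375 1.0
```

Note that the two averages disagree here: each label ranks its own samples perfectly ('macro' = 1.0), but pooling the scores exposes that label 1's scores sit lower than label 0's ('micro' = 0.9375), which is exactly the cross-label information micro averaging adds.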

@FlorianMF added the enhancement and help wanted labels Mar 17, 2021
@SkafteNicki SkafteNicki self-assigned this Mar 23, 2021