
[MRG] API Replace scorer brier_score_loss with neg_brier_score #14898

Merged
merged 47 commits on Sep 19, 2019

Conversation

@stefan-matcovici (Contributor) commented Sep 6, 2019

Reference Issues/PRs
Fixes #13887

What does this implement/fix? Explain your changes.
Introduced a new scorer, neg_brier_score, to reflect the real behavior of the scorer
Added a deprecation for brier_score_loss (not sure about the versions mentioned in the deprecation message 😕)
Updated documentation to reflect the changes

Any other comments?
cloned from #14123
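For context, the renaming is about the scorer sign convention: scikit-learn scorers are maximized, so a loss has to be negated. A standalone sketch in plain Python (not the scikit-learn implementation; function names here are illustrative):

```python
# Sketch of why the scorer needs a "neg_" prefix. Scorer convention:
# a higher return value means a better model.
def brier_score_loss(y_true, y_prob):
    # Mean squared difference between predicted probability and outcome;
    # lower is better.
    return sum((p - y) ** 2 for y, p in zip(y_true, y_prob)) / len(y_true)

def neg_brier_score(y_true, y_prob):
    # Negated so that a better model (lower loss) gets a higher score,
    # matching what model-selection tools maximize.
    return -brier_score_loss(y_true, y_prob)

y_true = [0, 1, 1, 0]
good = [0.1, 0.9, 0.8, 0.2]   # well-calibrated predictions
bad  = [0.9, 0.1, 0.3, 0.8]   # poorly calibrated predictions

# Under the maximization convention, the better model now ranks higher.
assert neg_brier_score(y_true, good) > neg_brier_score(y_true, bad)
```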

Bharat123rox and others added 30 commits May 21, 2019
鈥nto pr/13918

# Conflicts:
#	sklearn/metrics/tests/test_score_objects.py
- change neg_brier_score to neg_brier_score_loss
- change neg_brier_score to neg_brier_score_loss
鈥-learn into pr/13918

锟 Conflicts:
锟	sklearn/metrics/tests/test_score_objects.py
@jnothman (Member) commented Sep 10, 2019

@stefan-matcovici (Contributor, Author) commented Sep 10, 2019

What do you mean by

look up deprecated scorers differently in get_scorer

The only way I see this happening is by checking against all the scorer objects declared in scorer.py, or by creating a new private dict containing the deprecated scorers.

What if we go with the first solution proposed by @jnothman?

I think it would be better if we kept SCORERS as is

The error message remains the same, meaning that the user should check the SCORERS dict for valid scorers. As implemented now, valid_scorers returns the non-deprecated scorers. Deprecated scorers are not invalid: when using one, the user gets a warning that the scorer will be removed in the near future.

doc/whats_new/v0.22.rst (outdated, thread resolved)
@qinhanmin2014 (Member) commented Sep 10, 2019

The error message remains the same, meaning that the user should check the SCORERS dict for valid scorers. As implemented now, valid_scorers returns the non-deprecated scorers. Deprecated scorers are not invalid: when using one, the user gets a warning that the scorer will be removed in the near future.

Let's remove 'brier_score_loss' from SCORERS

The only way I see this happening is by checking for all the scorer objects declared in scorer.py or by creating a new private dict containing deprecated scorers

we only need to handle brier_score_loss separately at the beginning.
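A hypothetical sketch of the approach being discussed: keep the deprecated name out of SCORERS and special-case it at the top of get_scorer. The dict names, the stand-in scorer object, and the message wording below are illustrative, not the actual scikit-learn source.

```python
import warnings

SCORERS = {"neg_brier_score": object()}           # stand-in scorer object
_DEPRECATED_SCORERS = {"brier_score_loss": "neg_brier_score"}

def get_scorer(scoring):
    # Handle deprecated names first, before the normal lookup.
    if scoring in _DEPRECATED_SCORERS:
        replacement = _DEPRECATED_SCORERS[scoring]
        warnings.warn(
            "Scoring method %s was renamed to %s and will be removed "
            "in a future release." % (scoring, replacement),
            FutureWarning,
        )
        scoring = replacement
    try:
        return SCORERS[scoring]
    except KeyError:
        # The error message can keep pointing users at the SCORERS dict,
        # which now only contains non-deprecated names.
        raise ValueError("%r is not a valid scoring value. "
                         "Use sorted(SCORERS) to get valid options." % scoring)
```

With this shape, the deprecated name still works (with a FutureWarning) while never appearing in the advertised SCORERS dict.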

@jnothman (Member) left a comment

Otherwise lgtm

sklearn/metrics/scorer.py (thread resolved)
@@ -61,7 +61,7 @@ Scoring Function
 'accuracy' :func:`metrics.accuracy_score`
 'balanced_accuracy' :func:`metrics.balanced_accuracy_score`
 'average_precision' :func:`metrics.average_precision_score`
-'brier_score_loss' :func:`metrics.brier_score_loss`
+'neg_brier_score_loss' :func:`metrics.brier_score_loss`
@jnothman (Member) Sep 10, 2019

I think `neg_brier_score` might be a friendlier name. neg already implies that the score would otherwise decrease with improvement, and "Brier score loss" is not a term known in the literature: it appears only about 70 times on Google outside of scikit-learn contexts.

@qinhanmin2014 (Member) Sep 10, 2019

I think `neg_brier_score` might be a friendlier name. neg already implies that the score would otherwise decrease with improvement,

Maybe neg_brier_score_loss is a better name, because the name of the function is brier_score_loss and we also have other similar scorers like "neg_log_loss"? @jnothman

and "Brier score loss" is not something known in the literature

If so, we should change the name of the function.


Can you explain the definition of this neg_brier_score? Please look at my post: https://stackoverflow.com/questions/62853440/neg-brier-score-returns-negative-score
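The negative value asked about there is expected: neg_brier_score is minus the Brier loss, so for binary outcomes it lies in [-1, 0], with 0 the best possible value. A minimal illustration in plain Python (not the scikit-learn scorer itself):

```python
# The negated Brier score is always <= 0; values closer to 0 indicate
# better-calibrated probability predictions.
def neg_brier_score(y_true, y_prob):
    return -sum((p - y) ** 2 for y, p in zip(y_true, y_prob)) / len(y_true)

print(neg_brier_score([0, 1], [0.0, 1.0]))  # perfect predictions -> 0.0
print(neg_brier_score([0, 1], [1.0, 0.0]))  # worst predictions   -> -1.0
print(neg_brier_score([0, 1], [0.5, 0.5]))  # uninformative       -> -0.25
```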

sklearn/metrics/scorer.py (thread resolved)
doc/whats_new/v0.22.rst (outdated, thread resolved)
@jnothman (Member) commented Sep 10, 2019

@qinhanmin2014 (Member) commented Sep 10, 2019

We have the _loss suffix to clarify that smaller is better. But unlike
this, log loss is called "log loss".

The problem is that we add the _loss suffix to the name of the function, so maybe we should add it to the name of the scorer.

@jnothman (Member) commented Sep 10, 2019

@qinhanmin2014 (Member) commented Sep 10, 2019

We drop _score from other metric names. I don't think there's any way to
make this consistent, only readable.

OK, I don't think I'll argue with you about this. Let's use neg_brier_score unless other core devs disagree.

@stefan-matcovici please take some time to update the PR, thanks.

@qinhanmin2014 (Member) commented Sep 14, 2019

Do you have time to finish this one? @stefan-matcovici thanks.

@stefan-matcovici (Contributor, Author) commented Sep 15, 2019

Yes, I do.

@stefan-matcovici stefan-matcovici changed the title [MRG] API Replace scorer brier_score_loss with neg_brier_score_loss [MRG] API Replace scorer brier_score_loss with neg_brier_score Sep 16, 2019
@jnothman (Member) left a comment

Looking at it again, I wonder if we further want to simplify the scorer string to "neg_brier"... Seems more consistent still. Any other opinions?

@qinhanmin2014 (Member) commented Sep 17, 2019

Looking at it again, I wonder if we further want to simplify the scorer string to "neg_brier"... Seems more consistent still. Any other opinions?

Why is it more consistent? According to google, "brier score" is a term but "brier" is not. (Similarly, "precision" is a term but "precision score" is not.)

@jnothman (Member) commented Sep 17, 2019

@qinhanmin2014 (Member) commented Sep 17, 2019

Tests are failing @stefan-matcovici, otherwise LGTM.

@qinhanmin2014 (Member) left a comment

thanks

@qinhanmin2014 merged commit d369a5f into scikit-learn:master Sep 19, 2019
18 checks passed
@stefan-matcovici (Contributor, Author) commented Sep 19, 2019

thanks a lot, guys, my first PR 😁

@qinhanmin2014 (Member) commented Sep 19, 2019

thanks for your patience :)

@jnothman (Member) commented Sep 19, 2019

Thanks for helping us finally close this, @stefan-matcovici, and congrats on the merge!!

7 participants