Update fbeta_score to accept beta=0 #13231
Conversation
@qinhanmin2014 IMHO, the current failure is not related to this PR. Thanks a lot!
Please update the doc accordingly, e.g., beta : float >= 0.
Ignore the test failure if it's irrelevant.
Please add an entry to the change log at
@qinhanmin2014 Thanks for the kind review!
@@ -205,10 +205,12 @@ def test_precision_recall_f_binary_single_class():
    assert_equal(1., precision_score([1, 1], [1, 1]))
    assert_equal(1., recall_score([1, 1], [1, 1]))
    assert_equal(1., f1_score([1, 1], [1, 1]))
    assert_equal(1., fbeta_score([1, 1], [1, 1], 0))
You need to update the comment above: Test precision, recall and F1 score
-> Test precision, recall and fscore,
or something you like.
Where's this change? You don't need to squash your commits.
Maybe also add assert_equal(XXX, fbeta_score([1, 1], [1, 1], float('inf')))
here.
@qinhanmin2014 Looks like something got mixed up while I was working on another machine.
@qinhanmin2014
aa90c28
I've added the test for that case.
Due to https://bugs.python.org/issue28579, I used the numpy.isnan method for this test.
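For context on why an isnan check is needed instead of a plain equality assertion: NaN never compares equal to anything, including itself, so an `assert_equal`-style check on a NaN result can never pass. A stdlib-only illustration (using `math.isnan` in place of `numpy.isnan` to keep it dependency-free):

```python
import math

# NaN is never equal to anything, including itself, so an
# equality-based assertion on a NaN result always fails:
nan = float("nan")
assert nan != nan
assert not (nan == nan)

# hence the test asserts with an isnan check instead:
assert math.isnan(nan)
```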
assert_true is deprecated, use bare assert instead
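A sketch of the requested change, assuming the test originally used the deprecated `assert_true` helper from `sklearn.utils.testing` (the exact original line is an assumption):

```python
import math

score = float("nan")  # hypothetical value under test

# deprecated style (hypothetical original):
#     assert_true(np.isnan(score))

# preferred: a bare assert, which pytest introspects on failure
assert math.isnan(score)
```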
@qinhanmin2014 done
@qinhanmin2014
If we're supporting beta=0, we should support beta=inf. Can we please add a test for that?
@jnothman Sure!
@qinhanmin2014 PTAL
So actually we do not support beta=inf.
I'm +0 for beta=0. But if we support beta=0 we should support beta=inf!
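A minimal, pure-Python sketch of why those two limits pair up. This is the textbook F-beta formula, not scikit-learn's implementation (which also handles zero-division edge cases):

```python
def fbeta(precision, recall, beta):
    """Textbook F-beta: (1 + b^2) * P * R / (b^2 * P + R)."""
    if beta == 0:
        return precision              # limit beta -> 0
    if beta == float("inf"):
        return recall                 # limit beta -> inf
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

p, r = 0.9, 0.3
assert fbeta(p, r, 0) == p                   # beta=0 is pure precision
assert fbeta(p, r, float("inf")) == r        # beta=inf is pure recall
assert fbeta(p, r, 0.5) > fbeta(p, r, 2.0)   # smaller beta favors precision
```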
Note that if you keep working on this, the what's new entry should move to v0.22.rst, as v0.21 has been closed.
@corona10 Would you be interested in continuing this PR? You would need to resolve merge conflicts and handle the case of beta=inf.
@rth Thanks! I will take a look |
Force-pushed from 92f153e to 4663d09
Thanks! A few last comments. Also, please update the beta docstring to note that it needs to be positive.
Otherwise LGTM.
@rth
@@ -1067,7 +1067,7 @@ def fbeta_score(y_true, y_pred, beta, labels=None, pos_label=1,
    The `beta` parameter determines the weight of recall in the combined
    score. ``beta < 1`` lends more weight to precision, while ``beta > 1``
-   favors recall (``beta -> 0`` considers only precision, ``beta -> inf``
+   favors recall (``beta -> 0`` considers only precision, ``beta -> +inf``
@rth Is this the docstring you mean?
If not, please let me know which docstring should be updated.
Thanks!
I meant the
beta : float
    Weight of precision in harmonic mean.
but never mind, this description is enough.
Reference Issues/PRs
Fixes: #13224
What does this implement/fix? Explain your changes.
Update fbeta_score API to accept beta=0
Any other comments?
I am a first-time contributor to this project.
Please let me know if I made any mistakes!
Thanks!