
Update fbeta_score to accept beta=0 #13231

Merged
merged 7 commits into from Jul 19, 2019

Conversation


@corona10 corona10 commented Feb 23, 2019

Reference Issues/PRs

Fixes: #13224

What does this implement/fix? Explain your changes.

Update fbeta_score API to accept beta=0

Any other comments?

I am a first-time contributor to this project.
Please let me know if I make any mistakes!
Thanks
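
(For context: with the usual definition of the F-beta score, beta=0 weights recall by beta**2 = 0, so the score reduces to precision. A minimal sketch of the behaviour this PR enables, assuming a scikit-learn version that includes the change; the example labels are made up for illustration:)

    from sklearn.metrics import fbeta_score, precision_score

    y_true = [0, 1, 1, 0, 1]
    y_pred = [0, 1, 0, 0, 1]

    # With beta=0 the recall term is weighted by beta**2 = 0, so the
    # F-beta score collapses to precision.
    print(fbeta_score(y_true, y_pred, beta=0))   # 1.0
    print(precision_score(y_true, y_pred))       # 1.0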


@corona10 corona10 commented Feb 23, 2019

@qinhanmin2014
Hi, can you take a look please?

IMHO, the current failure is not related to this PR:

    >>> from sklearn.feature_extraction import image
    >>> # Use the array data from the first image in this dataset:

Thanks a lot!


@qinhanmin2014 qinhanmin2014 left a comment

Please update the doc accordingly, e.g., ``beta : float >= 0``.
Ignore the test failure if it's irrelevant.


@qinhanmin2014 qinhanmin2014 commented Feb 24, 2019

Please add an entry to the change log at doc/whats_new/v*.rst. Like the other entries there, please reference this pull request with :issue: and credit yourself (and other contributors if applicable) with :user:.

doc/whats_new/v0.21.rst (review thread, outdated; resolved)
doc/whats_new/v0.21.rst (review thread, outdated; resolved)

@corona10 corona10 commented Feb 24, 2019

@qinhanmin2014 Thanks for the kind review!


@qinhanmin2014 qinhanmin2014 left a comment

LGTM, thanks @corona10

@@ -205,10 +205,12 @@ def test_precision_recall_f_binary_single_class():
     assert_equal(1., precision_score([1, 1], [1, 1]))
     assert_equal(1., recall_score([1, 1], [1, 1]))
     assert_equal(1., f1_score([1, 1], [1, 1]))
+    assert_equal(1., fbeta_score([1, 1], [1, 1], 0))
@qinhanmin2014 qinhanmin2014 Feb 25, 2019

You need to update the comment above: "Test precision, recall and F1 score" -> "Test precision, recall and fscore", or something you like.

@qinhanmin2014 qinhanmin2014 Feb 28, 2019

> you need to update the comment above: Test precision, recall and F1 score -> Test precision, recall and fscore or something you like

Where's this change? You don't need to squash your commits.
Maybe also add assert_equal(XXX, fbeta_score([1, 1], [1, 1], float('inf'))) here.

@corona10 corona10 Feb 28, 2019

@qinhanmin2014 Looks like there was some accident while I was working on another machine.

@qinhanmin2014 qinhanmin2014 Feb 28, 2019

Maybe also add assert_equal(XXX, fbeta_score([1, 1], [1, 1], float('inf'))) here.

@corona10 corona10 Feb 28, 2019

@qinhanmin2014
aa90c28
I've added the test for that case.
Due to https://bugs.python.org/issue28579, I used the numpy.isnan method for this test.
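
(A minimal sketch of the check described here, assuming the behaviour discussed at this point in the thread, where beta=inf is not genuinely supported and the computation ends up as inf/inf, i.e. nan; since nan compares unequal to itself, the test uses numpy.isnan rather than an equality assertion. Later scikit-learn versions may treat beta=inf differently.)

    import numpy as np
    from sklearn.metrics import fbeta_score

    # beta=inf makes both numerator and denominator of the F-beta
    # formula infinite, so the score comes back as nan here.
    score = fbeta_score([1, 1], [1, 1], beta=float('inf'))
    assert np.isnan(score)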

@qinhanmin2014 qinhanmin2014 Feb 28, 2019

``assert_true`` is deprecated; use a bare ``assert`` instead.


@corona10 corona10 commented Feb 25, 2019

@qinhanmin2014
Done! Thanks!


@jnothman jnothman left a comment

If we're supporting beta=0, we should support beta=inf. Can we please add a test for that?


@corona10 corona10 commented Feb 28, 2019

@jnothman Sure!


@corona10 corona10 changed the title Update fbeta_score to accept beta=0 [WIP] Update fbeta_score to accept beta=0 Feb 28, 2019
@corona10 corona10 changed the title [WIP] Update fbeta_score to accept beta=0 Update fbeta_score to accept beta=0 Feb 28, 2019

@qinhanmin2014 qinhanmin2014 commented Feb 28, 2019

So actually we do not currently support beta=inf (see the new test). Do you think we need to add support for that, @jnothman? I'm +0, so you can make the decision here :)


@jnothman jnothman commented Feb 28, 2019

I'm +0 for beta=0. But if we support beta=0 we should support beta=inf!
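
(A quick sketch of the symmetry being appealed to here, using the standard F-beta definition with P for precision and R for recall; beta=0 gives pure precision and beta=inf gives pure recall:)

    F_\beta = (1 + \beta^2) \frac{P R}{\beta^2 P + R},
    \qquad
    \lim_{\beta \to 0} F_\beta = \frac{P R}{R} = P,
    \qquad
    \lim_{\beta \to \infty} F_\beta
        = \lim_{\beta \to \infty} \frac{(1/\beta^2 + 1) P R}{P + R/\beta^2}
        = \frac{P R}{P} = R.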


@jnothman jnothman commented May 1, 2019

Note that if you keep working on this, the what's new entry should move to v0.22.rst as v0.21 has been closed.


@rth rth commented Jul 18, 2019

@corona10 Would you be interested in continuing this PR?

You would need to resolve the merge conflicts and handle the case of beta=inf. I can review this once that is done.


@corona10 corona10 commented Jul 19, 2019

@rth Thanks! I will take a look

@corona10 corona10 changed the title Update fbeta_score to accept beta=0 [WIP] Update fbeta_score to accept beta=0 Jul 19, 2019
@corona10 corona10 force-pushed the gh-13224 branch 2 times, most recently from 92f153e to 4663d09 Compare Jul 19, 2019
@corona10 corona10 changed the title [WIP] Update fbeta_score to accept beta=0 Update fbeta_score to accept beta=0 Jul 19, 2019

@corona10 corona10 commented Jul 19, 2019

@rth cc @jnothman
Please take a look


@rth rth left a comment

Thanks! A few last comments. Also, please mention in the beta docstring that it needs to be positive.

Otherwise LGTM.

sklearn/metrics/classification.py (review thread, outdated; resolved)
sklearn/metrics/tests/test_classification.py (review thread; resolved)
doc/whats_new/v0.21.rst (review thread, outdated; resolved)

@corona10 corona10 commented Jul 19, 2019

@rth
I've updated the PR, please take a look

@@ -1067,7 +1067,7 @@ def fbeta_score(y_true, y_pred, beta, labels=None, pos_label=1,
     The `beta` parameter determines the weight of recall in the combined
     score. ``beta < 1`` lends more weight to precision, while ``beta > 1``
-    favors recall (``beta -> 0`` considers only precision, ``beta -> inf``
+    favors recall (``beta -> 0`` considers only precision, ``beta -> +inf``

@corona10 corona10 Jul 19, 2019


@rth Is this the docstring you mean?
If not, please let me know which docstring should be updated.
Thanks!

@rth rth Jul 19, 2019

I meant this one:

    beta : float
        Weight of precision in harmonic mean.

But never mind, this description is enough.

rth approved these changes Jul 19, 2019

@rth rth left a comment

Thanks @corona10!

Merging with a +2, despite Joel's +0. I don't think this is too controversial.

@rth rth merged commit b4c1c4e into scikit-learn:master Jul 19, 2019
17 checks passed
@corona10 corona10 deleted the gh-13224 branch Jul 19, 2019