
Suppress UndefinedMetric Warning for F1/precision/recall #671

Merged: 8 commits merged into master from ds_436_suppress_undefinedmetric on Apr 17, 2020

Conversation

@dsherry (Contributor) commented Apr 17, 2020

Fixes #436, builds off #588.

These warnings occur when the denominator in the computation of these metrics is 0. In that case we should simply return 0 for F1/precision/recall without warning. By default, sklearn returns 0 in this case but also emits a warning.
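For illustration only (this is not the PR's actual diff): a minimal sketch of the behavior described above, assuming scikit-learn 0.22+, which added a `zero_division` parameter to these metrics. The toy arrays are hypothetical.

```python
# Sketch: scikit-learn 0.22+ lets callers silence UndefinedMetricWarning
# by pinning the undefined (0/0) case to 0 via the zero_division parameter.
import numpy as np
from sklearn.metrics import f1_score, precision_score

# Hypothetical toy data: no positive predictions, so precision's
# denominator TP + FP is 0 and the metric is undefined (0/0).
y_true = np.array([0, 0, 1, 1])
y_pred = np.array([0, 0, 0, 0])

# Default (zero_division="warn"): returns 0.0 but emits UndefinedMetricWarning.
# With zero_division=0: returns the same 0.0, silently.
print(precision_score(y_true, y_pred, zero_division=0))  # 0.0, no warning
print(f1_score(y_true, y_pred, zero_division=0))         # 0.0, no warning
```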

@dsherry dsherry force-pushed the ds_436_suppress_undefinedmetric branch from 11341f5 to 3494584 Compare April 17, 2020 02:36
```diff
@@ -47,15 +47,14 @@ def objective_function(self, y_predicted, y_true, X=None):
         return metrics.balanced_accuracy_score(y_true, y_predicted)


 # todo does this need tuning?
```
@dsherry (Contributor, Author) commented on this diff:

Not directly related to this PR but I just noticed this and thought it should be deleted. FYI @kmax12

@codecov (bot) commented Apr 17, 2020

Codecov Report

Merging #671 into master will increase coverage by 0.01%.
The diff coverage is 100.00%.


```
@@            Coverage Diff             @@
##           master     #671      +/-   ##
==========================================
+ Coverage   99.04%   99.05%   +0.01%
==========================================
  Files         139      139
  Lines        4810     4882      +72
==========================================
+ Hits         4764     4836      +72
  Misses         46       46
```

| Impacted Files | Coverage Δ |
| --- | --- |
| evalml/objectives/standard_metrics.py | 99.54% <100.00%> (ø) |
| ...lml/tests/objective_tests/test_standard_metrics.py | 100.00% <100.00%> (ø) |

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 9848eff...82675a7.

@dsherry dsherry marked this pull request as ready for review April 17, 2020 03:03
@angela97lin (Contributor) left a comment:

LGTM! Though I see a lot of tests for standard metrics were added (yay!), including for accuracy; are those intentionally in this PR since you're adding other tests anyway?

@jeremyliweishih (Collaborator) left a comment:

LGTM

@dsherry dsherry merged commit 1273b44 into master Apr 17, 2020
@dsherry dsherry deleted the ds_436_suppress_undefinedmetric branch April 17, 2020 14:31
@dsherry (Contributor, Author) commented Apr 17, 2020

@angela97lin yep! I added more test coverage for the objectives affected by this PR. I hope we can eventually follow this pattern for all of the objectives we add.
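For illustration, a sketch of the kind of test this pattern implies (not this PR's actual test code; the plain sklearn call and the test name are assumptions):

```python
# Illustrative sketch: assert the metric stays silent and returns 0
# when its denominator is 0.
import warnings

import numpy as np
from sklearn.metrics import f1_score


def test_f1_zero_denominator_no_warning():
    y_true = np.array([0, 0, 1, 1])
    y_pred = np.array([0, 0, 0, 0])  # no positive predictions: F1 is 0/0

    with warnings.catch_warnings():
        warnings.simplefilter("error")  # any warning becomes a test failure
        score = f1_score(y_true, y_pred, zero_division=0)

    assert score == 0.0
```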

@dsherry dsherry mentioned this pull request Apr 17, 2020
Development

Successfully merging this pull request may close this issue:

- Suppress sklearn UndefinedMetric warning from stdout (F1 score)
4 participants