Suppress UndefinedMetric Warning for F1/precision/recall #671
Conversation
Force-pushed from 11341f5 to 3494584 ("…nd F1 tests. Updated changelog.")
```
@@ -47,15 +47,14 @@ def objective_function(self, y_predicted, y_true, X=None):
        return metrics.balanced_accuracy_score(y_true, y_predicted)

        # todo does this need tuning?
```
Not directly related to this PR but I just noticed this and thought it should be deleted. FYI @kmax12
Codecov Report

```
@@            Coverage Diff             @@
##           master     #671      +/-   ##
==========================================
+ Coverage   99.04%   99.05%   +0.01%
==========================================
  Files         139      139
  Lines        4810     4882      +72
==========================================
+ Hits         4764     4836      +72
  Misses         46       46
```

Continue to review full report at Codecov.
LGTM, though I see a lot of tests for standard metrics were added (yay!), including for accuracy; are those intentionally in this PR, since you're adding other tests anyway?
LGTM
@angela97lin yep! I added more test coverage for the objectives affected by this PR. I hope we can follow this pattern with all the objectives we add eventually.
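A minimal sketch of what such a test could look like; this is not the PR's actual test code, and it assumes scikit-learn's `zero_division` parameter (available since 0.22) as the suppression mechanism:

```python
import warnings
import numpy as np
from sklearn.metrics import f1_score

def test_f1_returns_zero_without_warning():
    # Degenerate case: no positive labels or predictions, so precision's
    # denominator (TP + FP) is 0 and the metric is undefined.
    y_true = np.array([0, 0, 0, 0])
    y_predicted = np.array([0, 0, 0, 0])
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        # zero_division=0 returns 0.0 silently; the fixed objectives are
        # expected to behave the same way.
        score = f1_score(y_true, y_predicted, zero_division=0)
    assert score == 0.0
    assert not caught  # no UndefinedMetricWarning should be emitted
```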
Fixes #436, builds off #588.

These warnings occur when the denominator in the computation of these metrics is 0. In that case we should simply return 0 for F1/precision/recall and not warn. Sklearn returns 0 by default but also emits an UndefinedMetricWarning.
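For illustration, a minimal sketch of sklearn's default behavior and one way to silence it, assuming scikit-learn >= 0.22 (the PR itself may suppress the warning differently):

```python
import warnings
from sklearn.metrics import precision_score

y_true = [0, 0, 0, 0]  # no positive labels
y_pred = [0, 0, 0, 0]  # no positive predictions, so TP + FP == 0

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    # Default zero_division="warn": returns 0.0 but warns.
    score = precision_score(y_true, y_pred)

print(score)                        # 0.0
print(caught[0].category.__name__)  # UndefinedMetricWarning

# Passing zero_division=0 keeps the 0.0 result and suppresses the warning.
score = precision_score(y_true, y_pred, zero_division=0)
```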