
Add Balanced Accuracy Metric #612

Merged
merged 8 commits into master on Apr 13, 2020

Conversation

gsheni
Member

@gsheni gsheni commented Apr 9, 2020

Pull Request Description
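For context on the metric this PR adds: balanced accuracy is the average of the recall obtained on each class, which makes it more informative than plain accuracy on imbalanced data. A minimal sketch using scikit-learn's built-in scorer (an illustration of the metric, not the PR's own code):

```python
from sklearn.metrics import balanced_accuracy_score

# Imbalanced toy data: four negatives, two positives.
y_true = [0, 0, 0, 0, 1, 1]
y_pred = [0, 0, 0, 1, 1, 0]

# Per-class recall: class 0 -> 3/4 = 0.75, class 1 -> 1/2 = 0.50.
# Balanced accuracy is their mean: (0.75 + 0.50) / 2 = 0.625.
score = balanced_accuracy_score(y_true, y_pred)
print(score)  # 0.625
```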

@gsheni gsheni self-assigned this Apr 9, 2020
@gsheni gsheni changed the title Add Balanced Accuracy Metric [WIP] Add Balanced Accuracy Metric Apr 9, 2020
@dsherry
Collaborator

dsherry commented Apr 13, 2020

@gsheni do you have time to finish this off this week?

@angela97lin just merged changes to how we define objectives, adding a BinaryClassificationObjective subclass. Check out her PR #624 for an example of how to update this PR after those changes; specifically, you'll now need to subclass BinaryClassificationObjective.
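For readers following along, the subclassing pattern described above might look roughly like the sketch below. The class and method names here are illustrative stand-ins, not evalml's actual API; see PR #624 for the real base class.

```python
from sklearn.metrics import balanced_accuracy_score


class BinaryClassificationObjective:
    """Stand-in for the base class referenced in the comment above
    (the real one lives in evalml.objectives)."""
    greater_is_better = True

    def score(self, y_true, y_predicted):
        raise NotImplementedError


class BalancedAccuracy(BinaryClassificationObjective):
    """Hypothetical objective wrapping scikit-learn's balanced accuracy."""
    name = "Balanced Accuracy"

    def score(self, y_true, y_predicted):
        return balanced_accuracy_score(y_true, y_predicted)


obj = BalancedAccuracy()
print(obj.score([0, 1, 0, 1], [0, 1, 0, 1]))  # 1.0 on a perfect prediction
```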

Review thread on evalml/objectives/utils.py (outdated, resolved)
@gsheni
Member Author

gsheni commented Apr 13, 2020

@dsherry
Yeah, I can get to it this week.

@codecov

codecov bot commented Apr 13, 2020

Codecov Report

Merging #612 into master will increase coverage by 0.00%.
The diff coverage is 100.00%.

Impacted file tree graph

@@           Coverage Diff           @@
##           master     #612   +/-   ##
=======================================
  Coverage   98.96%   98.96%           
=======================================
  Files         133      133           
  Lines        4534     4545   +11     
=======================================
+ Hits         4487     4498   +11     
  Misses         47       47           
Impacted Files Coverage Δ
evalml/objectives/__init__.py 100.00% <ø> (ø)
evalml/objectives/utils.py 94.44% <ø> (ø)
evalml/objectives/standard_metrics.py 99.53% <100.00%> (+0.01%) ⬆️
evalml/tests/objective_tests/test_objectives.py 96.29% <100.00%> (ø)
...lml/tests/objective_tests/test_standard_metrics.py 100.00% <100.00%> (ø)

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 4e49258...fa3ad50. Read the comment docs.

@gsheni gsheni changed the title [WIP] Add Balanced Accuracy Metric Add Balanced Accuracy Metric Apr 13, 2020
Collaborator

@dsherry dsherry left a comment

@gsheni if you check out #624 you'll see Angela added a "standard metrics" unit test file. Can you add something in there too? Other than that, looks great.
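The "standard metrics" unit test requested above might look something like the following sketch. This is a hypothetical pytest-style test, not the file Angela added; it checks the metric value against a hand-computed per-class recall average.

```python
import numpy as np
from sklearn.metrics import balanced_accuracy_score


def test_balanced_accuracy():
    y_true = np.array([0, 0, 1, 1])
    y_pred = np.array([0, 1, 1, 1])
    # Recall for class 0 is 1/2, for class 1 is 2/2,
    # so balanced accuracy is (0.5 + 1.0) / 2 = 0.75.
    assert balanced_accuracy_score(y_true, y_pred) == 0.75
```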

@dsherry
Collaborator

dsherry commented Apr 13, 2020

@gsheni I took a quick look at the codecov report and didn't understand why it was failing for this PR. Feel free to ping #evalml-dev if that's still failing after your latest push and you'd like some help!

@gsheni gsheni merged commit f28b876 into master Apr 13, 2020
2 checks passed
@gsheni gsheni deleted the balanced_accuracy branch Apr 13, 2020
2 participants