[LabelModel] Use classification_report for the LabelModel.score method #995
Comments
dcfidalgo added the "type: documentation" (Improvements or additions to documentation) and "labeling" labels on Jan 18, 2022
Can we close this PR?

Naa, still pending, it's on my to-do. I'll try to open a PR this week.
dcfidalgo pushed a commit that referenced this issue on Feb 16, 2022
…res (#1150)
* refactor: use classification_report to compute label model metrics
* test: adapt tests
* feat: default to dict output
* test: fix test
* docs: show output_str argument in guide, improve model evaluations
* test: add small test
* fix: add missing import
frascuchon pushed a commit that referenced this issue on Mar 3, 2022
…res (#1150)
* refactor: use classification_report to compute label model metrics
* test: adapt tests
* feat: default to dict output
* test: fix test
* docs: show output_str argument in guide, improve model evaluations
* test: add small test
* fix: add missing import
(cherry picked from commit fbff02d)
frascuchon pushed a commit that referenced this issue on Mar 4, 2022
…res (#1150)
* refactor: use classification_report to compute label model metrics
* test: adapt tests
* feat: default to dict output
* test: fix test
* docs: show output_str argument in guide, improve model evaluations
* test: add small test
* fix: add missing import
(cherry picked from commit fbff02d)
Return the str of sklearn.metrics.classification_report by default, and add an output_dict parameter to be able to output a dict instead. Show this feature in our wl guide/tutorial.
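A minimal sketch of the behavior being requested, using sklearn directly. The `LabelModel.score` method would presumably wrap this call; `classification_report` and its `output_dict` parameter are real sklearn API, but the labels below are made-up sample data for illustration.

```python
from sklearn.metrics import classification_report

# Toy gold labels and predictions, standing in for a label model's output
y_true = [0, 1, 1, 0, 1]
y_pred = [0, 1, 0, 0, 1]

# Default behavior: a human-readable string report
print(classification_report(y_true, y_pred))

# With output_dict=True: a nested dict, convenient for programmatic use,
# e.g. report["accuracy"] or report["macro avg"]["f1-score"]
report = classification_report(y_true, y_pred, output_dict=True)
print(report["accuracy"])  # 4 of 5 predictions match -> 0.8
```

Note that the referenced commit messages suggest the implementation ultimately defaulted to the dict output ("feat: default to dict output") rather than the string, with a separate argument to request the string form.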