
Module: Classifier Models -> Evaluation + Selection of Preferred #36

Open
jonc101 opened this issue Dec 14, 2017 · 0 comments

Comments

jonc101 (Collaborator) commented Dec 14, 2017

  • Take as input the results of issue Module: Feature Matrix -> Multiple Classifier Models #35, i.e., a battery of trained predictive models / classifiers.
  • Calculate standard evaluation metrics such as ROC AUC, area under the Precision-Recall curve, calibration curve, and Precision at top X(%), with a summary report for each model.
  • Allow sorting/selection of models by best performance on a given metric (see the sketch after this list).
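
A minimal sketch of what this module could look like, assuming scikit-learn-style classifiers with a `predict_proba` method; the model battery, data split, and function names here are hypothetical placeholders, not the project's actual API:

```python
# Sketch of the evaluation/selection module; model names and the toy data
# split are hypothetical placeholders standing in for the output of #35.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score, average_precision_score
from sklearn.calibration import calibration_curve


def precision_at_top_percent(y_true, y_score, percent=10):
    """Precision among the top `percent`% of cases ranked by predicted score."""
    k = max(1, int(len(y_score) * percent / 100))
    top_idx = np.argsort(y_score)[::-1][:k]
    return float(np.mean(np.asarray(y_true)[top_idx]))


def evaluate_models(models, X_test, y_test, top_percent=10):
    """Compute standard evaluation metrics for each fitted model."""
    report = {}
    for name, model in models.items():
        y_score = model.predict_proba(X_test)[:, 1]
        report[name] = {
            "roc_auc": roc_auc_score(y_test, y_score),
            # Average precision summarizes the area under the PR curve
            "avg_precision": average_precision_score(y_test, y_score),
            "precision_at_top_pct": precision_at_top_percent(
                y_test, y_score, top_percent
            ),
            # (prob_true, prob_pred) points for a reliability diagram
            "calibration_curve": calibration_curve(y_test, y_score, n_bins=10),
        }
    return report


def best_model(report, metric="roc_auc"):
    """Select the model name with the best performance on a given scalar metric."""
    return max(report, key=lambda name: report[name][metric])


if __name__ == "__main__":
    # Hypothetical battery of classifiers standing in for the upstream module
    X, y = make_classification(n_samples=2000, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    models = {
        "logreg": LogisticRegression(max_iter=1000).fit(X_train, y_train),
        "random_forest": RandomForestClassifier(random_state=0).fit(X_train, y_train),
    }
    report = evaluate_models(models, X_test, y_test)
    print("Best by ROC AUC:", best_model(report, "roc_auc"))
```

Note that `calibration_curve` returns curve points for a reliability diagram rather than a single scalar, so sorting/selection here is done on the scalar metrics (ROC AUC, average precision, precision at top X%).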
@jonc101 jonc101 created this issue from a note in Predicting Lab Results (To Do) Dec 14, 2017
@jonc101 jonc101 moved this from To Do to In Progress in Predicting Lab Results Oct 29, 2018
@jonc101 jonc101 moved this from In Progress to To Do in Predicting Lab Results Feb 1, 2019
@jonc101 jonc101 removed this from To Do in Predicting Lab Results Jul 23, 2019
Labels: None yet
Projects: Standard ML Pipeline (Awaiting triage)
Development: No branches or pull requests
1 participant