
Specify metrics in threshold_perf() #37


Closed

szego wants to merge 1 commit

Conversation


@szego szego commented Jan 17, 2021

Description

So that the user can specify which metrics are computed when calling `threshold_perf()`, we add a new `metrics` argument that accepts a `yardstick` metric set.

By default this argument is `NULL`, in which case the function computes the default metrics. If not `NULL`, the function checks that it's an appropriate metric set (class metrics only) and computes only the metrics provided in the set.
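
A sketch of how the proposed argument might be used, assuming this branch is installed (`cls_metrics` is an illustrative name; `segment_logistic` is the example data shipped with probably):

```r
library(probably)
library(yardstick)

# A metric set containing only class metrics, as the new check requires
cls_metrics <- metric_set(sens, spec, j_index)

# metrics = NULL (the default): the usual default metrics are computed
threshold_perf(segment_logistic, Class, .pred_good,
               thresholds = seq(0.1, 0.9, by = 0.1))

# With a metric set: only the supplied metrics are computed
threshold_perf(segment_logistic, Class, .pred_good,
               thresholds = seq(0.1, 0.9, by = 0.1),
               metrics = cls_metrics)
```

A set containing a probability metric such as `roc_auc()` would be rejected by the class-metrics-only check.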

Motivation and Context

Fixes #25

How Has This Been Tested?

Tested using yardstick 0.0.7 on R 4.0.2.

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)

Checklist:

  • My code follows the code style of this project.
  • My change requires a change to the documentation.
  • I have updated the documentation accordingly.

Note that I did not add an example using the new feature to the documentation for `threshold_perf()`.

@szego szego (Author) commented Feb 2, 2021

Forgot to note that the code that verifies the passed metric set (lines 106 to 117 in threshold_perf.R) incorporates some code from tune::check_metrics(). See lines 305 to 310 here: https://github.com/tidymodels/tune/blob/a97576f3f3e36f362388de7e2f3ef2df8ab3a38f/R/checks.R#L305
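
For reference, here is an illustrative sketch of that kind of check, adapted from the pattern in `tune::check_metrics()`; the function name, messages, and attribute access below are assumptions, not the exact PR code:

```r
validate_metrics <- function(metrics) {
  if (is.null(metrics)) {
    return(NULL)  # caller falls back to the default metrics
  }
  if (!inherits(metrics, "metric_set")) {
    rlang::abort("`metrics` should be created with `yardstick::metric_set()`.")
  }
  # threshold_perf() hard-classifies at each threshold, so only class
  # metrics (not probability metrics) make sense here. This assumes the
  # metric functions are stored in the "metrics" attribute of the set.
  metric_fns <- attr(metrics, "metrics")
  is_class <- vapply(metric_fns, inherits, logical(1), what = "class_metric")
  if (!all(is_class)) {
    rlang::abort("The metric set should contain only class metrics.")
  }
  metrics
}
```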

@topepo topepo (Member) commented Jan 2, 2023

Sorry for the long delay on this.

Since we were working on similar problems related to calibration, I made a separate branch and PR to solve this (but credited you in the news file).

Thanks for getting this going!

@topepo topepo closed this Jan 2, 2023
@topepo topepo mentioned this pull request Jan 2, 2023
@github-actions

This pull request has been automatically locked. If you believe you have found a related problem, please file a new issue (with a reprex: https://reprex.tidyverse.org) and link to this issue.

@github-actions github-actions bot locked and limited conversation to collaborators Jan 17, 2023
Closes #25: "Is it possible to specify a metric_set within threshold_perf?"