
Change default metrics for evaluate #949

Closed
amontanez24 opened this issue Aug 10, 2022 · 0 comments · Fixed by #958
@amontanez24
Contributor

Problem Description

As a user, it is confusing when multi-table metrics are applied to single-table data. Many of the metrics also result in errors.

Expected behavior

  • When running evaluate, the following metrics should be applied by default in the single table case
    • CSTest
    • KSTest
  • In the multi-table case, the multi-table versions of the following metrics should be used
    • CSTest
    • KSTest

Additional context

The metrics to use are currently selected in this method:

def _select_metrics(synthetic_data, metrics):

We should change this to only use the metrics described above. One thing to note is that if a user provides metadata and a single table as input, it will crash because of this line:

table = list(metadata['tables'].keys())[0]

This is not necessary to fix for this issue, but if it's easy enough then we should make the change.

@amontanez24 amontanez24 added feature request Request for a new feature new Automatic label applied to new issues and removed new Automatic label applied to new issues labels Aug 10, 2022
This was referenced Aug 15, 2022
@amontanez24 amontanez24 added this to the 0.17.0 milestone Aug 16, 2022