
Automatic Test Generation and Parameterization #86

Closed
2 of 3 tasks
vivekjoshy opened this issue Jul 6, 2023 · 1 comment
Comments

vivekjoshy (Owner) commented Jul 6, 2023

Is your feature request related to a problem? Please describe.
When a model is rewritten or improved, its internal changes can shift the expected API outputs significantly. Manually re-entering the correct values into the test suite to verify determinism is wasted developer effort in the long term.

Describe the solution you'd like
Use Hypothesis to generate tests, combined with pytest parameterization.
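As a rough sketch of the property-based side of this, something like the following could work — note that `rate` here is a hypothetical placeholder for the real model update, and the specific properties checked are illustrative assumptions, not the project's actual invariants:

```python
from hypothesis import given, strategies as st

# Hypothetical stand-in for a rating update; the real model's API may differ.
def rate(mu: float, sigma: float) -> tuple[float, float]:
    # Toy update: the mean decays and the uncertainty shrinks.
    return mu * 0.9, sigma * 0.95

@given(
    mu=st.floats(min_value=-50.0, max_value=50.0, allow_nan=False),
    sigma=st.floats(min_value=0.1, max_value=25.0, allow_nan=False),
)
def test_rate_properties(mu: float, sigma: float) -> None:
    new_mu, new_sigma = rate(mu, sigma)
    # Properties hold for any generated input, so expected values never
    # need to be re-entered by hand when the model internals change.
    assert new_sigma <= sigma       # uncertainty never grows
    assert abs(new_mu) <= abs(mu)   # mean does not overshoot
```

Because the assertions are properties rather than hard-coded outputs, a rewrite of the model only breaks the suite when it violates an invariant, not merely because the numbers moved.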

Tasks:

  • Decouple benchmarks into a top-level package of its own for loading different kinds of data.
  • Create a command line utility to run benchmarks and regenerate tests.
  • Import benchmarks package to load data for testing purposes.
@vivekjoshy vivekjoshy added the enhancement New feature or request label Jul 6, 2023
@vivekjoshy vivekjoshy added this to the v5.0.0 milestone Jul 6, 2023
@vivekjoshy vivekjoshy self-assigned this Jul 6, 2023
@vivekjoshy vivekjoshy mentioned this issue Jul 6, 2023
vivekjoshy (Owner, Author) commented

Using Hypothesis offline to help generate tests is a good idea. It will not be added to the test suite itself, though.
