
Data validation two ways #29

Closed
caufieldjh opened this issue Sep 10, 2021 · 3 comments · Fixed by #145
Labels
enhancement New feature or request

Comments

@caufieldjh (Collaborator)

Describe the desired behavior

Would like to validate transforms in at least the following two ways:

  • Check for changes above a certain threshold, e.g., flag a new version that differs too much from the previous one
  • Run unit tests to verify that certain axiom-dependent patterns are represented as expected (but this will require pre-processing to relax the axioms first)
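A minimal sketch of the first check, comparing two versions of a transform output as plain text with the standard library's `difflib`; the 0.9 similarity threshold is an illustrative assumption, not a project setting:

```python
# Hypothetical change-threshold check (not project code).
import difflib

def change_ratio(old_text: str, new_text: str) -> float:
    """Line-level similarity between two versions: 1.0 = identical, 0.0 = disjoint."""
    return difflib.SequenceMatcher(
        None, old_text.splitlines(), new_text.splitlines()
    ).ratio()

def too_many_changes(old_text: str, new_text: str,
                     min_similarity: float = 0.9) -> bool:
    """Flag the new version if it has drifted too far from the previous one."""
    return change_ratio(old_text, new_text) < min_similarity
```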
@caufieldjh caufieldjh added the enhancement New feature or request label Sep 10, 2021
@caufieldjh (Collaborator, Author)

This may be easier if coupled with #30.

@caufieldjh caufieldjh mentioned this issue Sep 23, 2021
@caufieldjh (Collaborator, Author)

Doing a full diff for every update can be slow, but we can run some quick sanity checks:

  • Is the new version empty, particularly after ROBOT processing? (We already do this.)
  • Does the new version involve a different set of imports than the last?
  • Is the new version >10% different from the previous one in total number of lines?

@caufieldjh (Collaborator, Author)

The first point is covered in #125.

@caufieldjh caufieldjh linked a pull request Dec 1, 2021 that will close this issue