feat: performance results file validation #216
Conversation
🦋 Changeset detected. Latest commit: fa11aee. The changes in this PR will be included in the next version bump. This PR includes changesets to release 3 packages.
Co-authored-by: Maciej Jastrzebski <mdjastrzebski@gmail.com>
The code looks pretty good 👍🏻 I've added some comments about improving the code organisation a bit more.
I think we should also add some basic unit tests that would load a couple of hardcoded performance files from disk and check that they match the expected output. The goal here is not to test the Zod library, as it probably has its own tests, but rather to make sure that we are still able to load typical performance results files. I would go with four input files: one with a header, one without a header, one with incorrect JSON, and one with duplicate entries.
I have resolved all of the code feedback, @mdjastrzebski. Will get started on unit testing.
@mdjastrzebski will it be a good idea to […]? Not very keen on writing tests for the full […]. Another approach can be to write tests for the exported methods from […]. Please suggest.
@ShaswatPrabhat exporting […]. Also please do not make the tests overly complicated; simple test files with/without a header row and 2-3 entries should be enough. Then you can put assertions on metadata, total number of entries, and values from 1-2 entries per file. In the case of a single failing test, just check the error message. BTW the code looks really good now, I've added some minor comments but I am very happy with the results 🚀
Co-authored-by: Maciej Jastrzebski <mdjastrzebski@gmail.com>
I have added 2 basic snapshot tests based on […]. Please let me know if this is similar to what you have in mind, @mdjastrzebski. As the error and success outputs are both too large, I have snapshotted them.
@ShaswatPrabhat Looks good, I plan on merging it tomorrow. Thank you for contributing this feature.
Looks great! Thanks for contributing this feature @ShaswatPrabhat 🚀
Validate performance results files input to the compare package using Zod.