[feat] Produce detailed JSON report for a regression testing session #1377
Conversation
It looks something like this:
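As a rough illustration of what such a run report could contain, here is a minimal sketch assembled with Python's standard json module; all field names and values are illustrative assumptions, not the actual ReFrame report schema:

```python
import json
import time

# Illustrative structure only: the real schema is defined by ReFrame itself.
report = {
    'session_info': {
        'cmdline': './bin/reframe -r -c tutorials/',  # hypothetical invocation
        'time_start': time.strftime('%Y-%m-%dT%H:%M:%S'),
        'num_cases': 1,
        'num_failures': 0,
    },
    'runs': [
        {
            'testcases': [
                {
                    'name': 'StreamTest',
                    'system': 'daint:gpu',     # hypothetical system partition
                    'environ': 'PrgEnv-gnu',
                    'result': 'success',       # would differ for a failed case
                },
            ],
        },
    ],
}

with open('run-report.json', 'w') as fp:
    json.dump(report, fp, indent=2)
```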
ekouts left a comment:
I have two small comments that are more like questions.
@rsarm Can you put an example of the output in case of a successful test?
@vkarak I didn't make too much distinction. The only difference is that the key …
Apart from @ekouts' comments, here are some additional ones:
We also need the following fields:
Success: …
Fail: …
Codecov Report

@@            Coverage Diff             @@
##           master    #1377      +/-   ##
==========================================
+ Coverage   91.83%   91.93%   +0.10%
==========================================
  Files          82       82
  Lines       12801    12898      +97
==========================================
+ Hits        11756    11858     +102
+ Misses       1045     1040       -5
ekouts left a comment:
lgtm
@jenkins-cscs retry daint
vkarak left a comment:
The core part of the PR is fine, except for a couple of really minor omissions. We need to think a bit more about the CLI part and how we expose the functionality to the user.
@jenkins-cscs retry all
@rsarm Even the generic unit tests from Travis are failing.
The same happens if I run them locally. Can you please fix them?
Ok, I see the problem. You don't correctly handle the case where the …
But the unit tests must not create artifacts outside the temp directory that they are using, so just creating that directory is not acceptable. It needs further thinking.
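One way to keep such artifacts inside a test's own sandbox is to point $HOME at a pytest-managed temporary directory, so that anything expanding ~/.reframe is written there. A minimal sketch of that pattern, using a hypothetical write_report() stand-in for the code under test:

```python
import os


def write_report(path=None):
    # Hypothetical stand-in for the code under test: by default it writes
    # the report under the user's home directory.
    path = path or os.path.expanduser('~/.reframe/reports/run-report.json')
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, 'w') as fp:
        fp.write('{}')
    return path


def test_report_stays_in_tmpdir(tmp_path, monkeypatch):
    # Redirect HOME so that ~/.reframe resolves inside the pytest temp dir
    # (POSIX; on Windows, expanduser consults USERPROFILE instead).
    monkeypatch.setenv('HOME', str(tmp_path))
    report = write_report()
    assert report.startswith(str(tmp_path))
```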
vkarak left a comment:
I will fix the PR.
@rsarm We will also need some documentation for this. For the moment, we won't need a detailed description of what the report contains.
Hello @rsarm, thank you for updating! Cheers! There are no PEP8 issues in this pull request! Do see the ReFrame Coding Style Guide.
- Also a minor fix in the report name generation

Closes #667

EDIT from @vkarak:
This PR adds to ReFrame the capability of producing a detailed JSON report whenever it runs. By default, the report is placed under $HOME/.reframe/reports. Each new ReFrame run generates a new report. A command line argument, a configuration parameter and an environment variable controlling this are added. All other reports generated by ReFrame should use this JSON. This is already done for the standard failure report, but it is left for future work for the current performance report (the performance data of the tests is already in the JSON). Here is an example output running StreamTest:

TODO:
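Independently of the TODO items, and assuming the default location mentioned above, the most recent report could be inspected with a short Python snippet like the one below; the file glob pattern and field names are assumptions for illustration:

```python
import glob
import json
import os

# Pick the most recent JSON report under the default directory mentioned in
# the description ($HOME/.reframe/reports); the exact file naming is assumed.
reports = sorted(glob.glob(os.path.expanduser('~/.reframe/reports/*.json')),
                 key=os.path.getmtime)
if reports:
    with open(reports[-1]) as fp:
        report = json.load(fp)

    # Field names are illustrative; consult the actual report for its schema.
    print(report.get('session_info', {}))
```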