
[Analyzer Results] Do all steps of a test rather than failing on an Analyzer Score not being reached #2682

Closed
Tracked by #2615
kdhamric opened this issue Jun 7, 2023 · 4 comments · Fixed by #2697
Labels: backend, design, enhancement (New feature or request), frontend, triage (requires triage)

Comments

kdhamric (Collaborator) commented Jun 7, 2023

Currently, the new analyzer, if enabled, checks whether the score is below the threshold value and, if so, stops test processing, leaving the test in the ANALYZING_ERROR state. Instead of doing this, test processing should continue through to the FINISHED state.

AC1: When running a test, processing should not stop because the analyzer score is below a certain threshold.

AC2: If the Analyzer is enabled, the Trace screen should indicate whether the overall score (i.e., the Trace Analyzer Result) was below the threshold for a successful run. It should clearly show the overall Trace Analyzer Score, the minimum acceptable score at the time the test run was executed, and whether that score passed or failed. Suggested wording (although a graphic may work better):
Overall Trace Analyzer Score: 61. Minimum acceptable score: 70. This trace analyzer result is: Failed.

We will need other follow-up issues to make the CLI handle failure to meet the score, but this first change is needed so test specs run properly regardless of whether the test score passes or fails.

xoscar (Collaborator) commented Jun 7, 2023

@olha23, do you mind taking a look at this? We would need a place in the Trace tab to show that the analyzer failed because of the minimum score rule.

jorgeepc (Contributor) commented Jun 7, 2023

Looks good. One question, @kdhamric: we are going to fail the test if the analyzer score is below the threshold, right? I mean, we will still execute the specs, but we want to mark the test as failed. Is that correct?

kdhamric (Collaborator, Author) commented Jun 7, 2023

In the UI, we want to start indicating score information and whether the test met the minimum score. @olha23 has mocks in #2650 that show the overall score in the results view, with the image in red or green to indicate whether the trace analyzer met the minimum score.

We do, however, need to mark the entire test run as passed or failed (indicated by the green or red circle at the beginning of a test run) based on some logic: Were there any assertions that failed? Is the trace analyzer on, and if so, was the overall score below the minimum? We may also have a setting that says 'score the traces and show pass or fail for the analyzer, but do not fail the overall test' (i.e., just show scores and whether we passed or not).

Rather than having this logic in the front end and in the CLI, we may want to centralize it and deliver it as part of the test run info. Creating an issue to describe the problem.
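The centralized verdict described above could be sketched roughly as follows. This is a minimal illustration, not Tracetest's actual API: the type and field names (RunResult, FailTestOnAnalyzerScore, etc.) are hypothetical, invented here to show the decision logic.

```go
package main

import "fmt"

// RunResult is a hypothetical summary of a test run, combining spec
// assertion results with the trace analyzer outcome.
type RunResult struct {
	FailedAssertions int  // number of spec assertions that failed
	AnalyzerEnabled  bool // is the trace analyzer turned on?
	AnalyzerScore    int  // overall Trace Analyzer Score for this run
	MinimumScore     int  // minimum acceptable score at run time
	// Optional setting: score the traces and show pass/fail for the
	// analyzer, but do not fail the overall test because of the score.
	FailTestOnAnalyzerScore bool
}

// Passed computes the overall test-run verdict (the green or red circle):
// any failed assertion fails the run; a below-minimum analyzer score fails
// it only when the analyzer is enabled and configured to gate the run.
func (r RunResult) Passed() bool {
	if r.FailedAssertions > 0 {
		return false
	}
	if r.AnalyzerEnabled && r.FailTestOnAnalyzerScore && r.AnalyzerScore < r.MinimumScore {
		return false
	}
	return true
}

func main() {
	run := RunResult{
		AnalyzerEnabled:         true,
		AnalyzerScore:           61,
		MinimumScore:            70,
		FailTestOnAnalyzerScore: true,
	}
	fmt.Println(run.Passed()) // false: score 61 is below the minimum of 70
}
```

Because the server computes the verdict once and delivers it with the run info, the front end and the CLI would both just read it instead of duplicating the rule.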

olha23 commented Jun 8, 2023

@xoscar xoscar linked a pull request Jun 9, 2023 that will close this issue