
Generation of automatic result (pass/warning/fail) based on defined performance acceptance criteria #5321

Open
asfimport opened this issue May 20, 2020 · 8 comments


piotr.mirek (Bug 64457):

TEST RESULT (PASS/WARNING/FAIL) EVALUATION

Currently the Dashboard provides a very nice report, though it lacks an important feature: a clear indication of whether the test passed or not.

It should be possible for the user to define this based on performance SLAs/requirements, so the result can be evaluated easily. Some examples:

  • test is considered failed if 'http_sampler_login' 95th percentile > 300ms
  • test is considered failed if 'oracle_sampler_commit_new_user' avg > 500ms
  • test is considered failed if 'soap_set_options' throughput TPS < 20
  • test raises a warning if 'rest_get_owner' error % > 1 and < 5
  • test is considered failed if 'rest_get_owner' error % > 5

Defining the failure condition is a must here; the warning level is useful for raising attention that a problem may be near.

TEST RESULT DEFINITION

  • no errors & no warnings => test result is PASSED
  • 1 or more warnings (and no errors) => test result is WARN
  • 1 or more errors => test result is FAILED

This should be reflected in the Dashboard statistics with the relevant colors (please see the attachment), using CSS classes for errors/warnings/failures.

PAC CONFIG ELEMENT

To achieve this, there should be a config element - let's call it PAC (performance acceptance criteria) for now - that can be added to each sampler.

PAC should allow optionally setting pass/warning/fail ranges for each value that can be seen in the dashboard's main statistics, e.g.

Executions

  • #Samples
  • Error %
  • KO

Response Times (ms)

  • Average
  • Min
  • Max
  • 90th pct
  • 95th pct
  • 99th pct

Throughput

  • Transactions/s

Network (KB/sec)

  • Received
  • Sent

Please see the PAC element prototype in the attachments for a better overview.

TEST DASHBOARD/RESULTS ELEMENT

It should be possible to add the test dashboard as an element in the test plan, so it can be visualized together with the PAC config elements in the same plan (attachments).

If possible, the dashboard element should pick up data from new PAC config elements on the fly (please see the attachment).

The evaluation result should also be provided in JSON format, as a file that can easily be integrated into CI/CD pipelines (parsing JSON is "cheaper" than XML, and .json files can easily be sent via curl to many endpoints).

Please see the test plan example in the attachments.

OS: All


piotr.mirek (migrated from Bugzilla):
Created attachment prototypes.png: prototypes of requested features

piotr.mirek (migrated from Bugzilla):
Created attachment dashboard.with.acceptance.criteria.png: dashboard with acceptance criteria

piotr.mirek (migrated from Bugzilla):
Created attachment dashboard-report.element.prototype.png: dashboard element prototype

piotr.mirek (migrated from Bugzilla):
Created attachment PAC.config.element.prototype.png: PAC config element prototype

piotr.mirek (migrated from Bugzilla):
Created attachment test.plan.example.png: test plan example


piotr.mirek (migrated from Bugzilla):
The attachments contain graphical samples/prototypes of how the requested features could look.

The current JMeter dashboard report is informative only; the requested changes would make it more of a decision maker and easier to use within CI/CD pipelines and cloud integrations.


piotr.mirek (migrated from Bugzilla):
If possible, please implement the same scoping policy as for other JMeter components, where a config element applies to the hash tree branch it is placed in.

E.g. a PAC config element assigned to a thread group (or simple controller, etc.):

  • applies to all samplers under the thread group
  • can be overridden by assigning a PAC config element to a specific sampler

This way, users avoid assigning multiple PACs with the same requirements across a whole group.
