
tests: Make the OMB validator use result.json #16086

Merged: 1 commit merged into dev from stephan/omb-validator-fix on Jan 16, 2024

Conversation

StephanDollberg (Member)

The OMB validator was not validating p999 even though it was specified
in the validator requirements.

There were two reasons for that:

  • We were using the data from the output of the generate_charts
    command which is incomplete and doesn't return p999 e2e latency
  • This wasn't fatal, because the validator logic silently skipped
    requested validation metrics that were missing from the metrics output

This patch fixes both by reading the result.json file directly, and it
further inverts the validation loop so that it errors out if a requested
validation metric does not exist.
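
To make the inversion concrete, here is a minimal sketch of the combined fix (the check-callable shape of `validator` and the function name `validate_results` are assumptions for illustration; the real change lives in the diff below):

```python
import json

def validate_results(node, validator, result_file):
    # Read the full result.json written by OMB instead of parsing the
    # incomplete output of the generate_charts command.
    metrics = json.loads(node.account.ssh_output(f"cat {result_file}"))

    # Iterate over the *requested* validations rather than the reported
    # metrics, so a missing metric is a hard error, not a silent skip.
    for key, check in validator.items():
        assert key in metrics, f"Missing requested validator key {key} in metrics"
        check(metrics[key])
```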

Backports Required

  • none - not a bug fix
  • none - this is a backport
  • none - issue does not exist in previous branches
  • none - papercut/not impactful enough to backport
  • v23.3.x
  • v23.2.x
  • v23.1.x

Release Notes

  • none

@andrewhsu (Member)

/cdt
rp_version=build
tests/rptest/tests/services_self_test.py::OpenBenchmarkSelfTest

@StephanDollberg (Member Author)

> tests/rptest/tests/services_self_test.py::OpenBenchmarkSelfTest

Oh nice, didn't know we had those.

andrewhsu previously approved these changes Jan 12, 2024
@andrewhsu left a comment:

LGTM

@@ -317,7 +317,18 @@ def check_succeed(self, validate_metrics=True):
     self.raise_on_bad_log_lines(node)
     # Generate charts from the result
     self.logger.info(f"Generating charts with command {self.chart_cmd}")
-    metrics = json.loads(self.node.account.ssh_output(self.chart_cmd))
+    self.node.account.ssh_output(self.chart_cmd)
Member commented:

So we still get the chart in the artifact, I guess?

StephanDollberg (Member Author) commented:

Yep correct

+    metrics = json.loads(
+        self.node.account.ssh_output(
+            f'cat {OpenMessagingBenchmark.RESULT_FILE}'))

+    # Previously we were using generate_charts.py to get the metrics which
+    # was incomplete and didn't include the e2e p999 latency.
Member commented:

Huh, maybe that's why it was being used...

StephanDollberg (Member Author) commented:

Yes, possibly, though I could also see a case where we were already using it, then wanted the new metric and just added it to generate_charts directly.


+    for key in validator.keys():
+        if key not in metrics:
+            assert False, f"Missing requested validator key {key} in metrics"
Member commented:

nit: this could just be `assert key in metrics, ...`?

StephanDollberg (Member Author) commented:

Of course yes, fixed.
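
Presumably the merged loop then collapses to the suggested one-liner (the final form isn't shown on this page, so this is an assumption):

```python
for key in validator.keys():
    # Error out loudly when a requested validation metric is missing.
    assert key in metrics, f"Missing requested validator key {key} in metrics"
```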

travisdowns previously approved these changes Jan 13, 2024
ballard26 previously approved these changes Jan 13, 2024
@vbotbuildovich (Collaborator) commented Jan 15, 2024

@travisdowns (Member)

Failure is: #16026

@piyushredpanda merged commit ab502f3 into dev on Jan 16, 2024
14 of 17 checks passed
@piyushredpanda deleted the stephan/omb-validator-fix branch on January 16, 2024 at 10:53