KPI Request: Test Runtime or time to Test #117
Comments
@oindrillac where is the runtime value? We can see in the Prow data in the image below that for each test run there is a runtime associated with it. The question is: is this info present in the testgrid data? Or do we have to step into the Prow data to get it (something I would like to avoid for this current round of work)? There are a number of metadata fields for each test run that we have not yet fully explored, and perhaps the runtime value can be found there somewhere.
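As a minimal sketch of that exploration (assuming the public testgrid.k8s.io table endpoint returns JSON when fetched directly, and borrowing the dashboard and tab names from the link later in this thread), one could list the fields attached to each test row and check whether anything runtime-like is already there:

```python
import requests

# Assumptions: the testgrid.k8s.io "table" endpoint serves JSON for a given
# dashboard/tab; the names below come from the link later in this thread.
url = (
    "https://testgrid.k8s.io/redhat-openshift-ocp-release-4.2-informing/table"
    "?tab=release-openshift-origin-installer-e2e-aws-upgrade-rollback-4.1-to-4.2"
)
data = requests.get(url).json()

# Top-level keys of the grid response.
print(sorted(data.keys()))

# Fields attached to the first test row; a runtime-like field would show up
# here (assumes the response carries a "tests" list, as in the EDA notebook).
print(sorted(data["tests"][0].keys()))
```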
@MichaelClifford When we run this cell, we see that one of the columns in the resulting data frame is […].
@oindrillac @hemajv not sure where you are at on this issue, but here is a quick answer to what is meant by the fail and pass timestamps. If you go to the TestGrid dashboard and click "Show Alerts" on any of the grid names, it will display some highlighted tests and include fields for "First Failed" and "Last Passed". I'm pretty sure these are the values displayed in fail_timestamp and pass_timestamp. Since these values exist only for the "alert" tests, I don't think they are sufficient to make a claim about the time to test or the total runtime for each test for the whole platform. Were you able to find the test runtime any other way?
I think a fairly simple way to do this without looking into the Prow data would be to leverage the inbuilt graph within TestGrid for each test in the job. These graphs, which can be toggled on and off by going to Graph > test-duration-metrics, capture the time elapsed while running each test. The time elapsed for the test (test-duration-minutes) is plotted over time and captures values such as […]. To access the JSON for this, we can get it from https://testgrid.k8s.io/redhat-openshift-ocp-release-4.2-informing/table?%20\%20&show-stale-tests=&tab=release-openshift-origin-installer-e2e-aws-upgrade-rollback-4.1-to-4.2&graph-metrics=test-duration-minutes which contains the test duration values for each test. This should be fairly straightforward to plot in a notebook, as sketched below.
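A minimal sketch of that plotting, under the assumption that each entry in the response's tests list carries a graphs field whose first element holds the test-duration-minutes series (the exact shape should be verified against the raw JSON; the extra encoded parameters in the link above are dropped here):

```python
import matplotlib.pyplot as plt
import requests

# graph-metrics=test-duration-minutes asks TestGrid to include the duration
# graph data in the table response (URL trimmed from the comment above).
url = (
    "https://testgrid.k8s.io/redhat-openshift-ocp-release-4.2-informing/table"
    "?tab=release-openshift-origin-installer-e2e-aws-upgrade-rollback-4.1-to-4.2"
    "&graph-metrics=test-duration-minutes"
)
data = requests.get(url).json()

# Assumption: each test row has a "graphs" list whose first element holds the
# duration values, aligned with the grid's run timestamps (ms since epoch).
test = data["tests"][0]
durations = test["graphs"][0]["values"][0]
timestamps = data["timestamps"][: len(durations)]

plt.plot(timestamps, durations)
plt.xlabel("run timestamp (ms since epoch)")
plt.ylabel("test duration (minutes)")
plt.title(test["name"])
plt.show()
```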
I agree, by extracting the test-duration-minutes values we should be able to get the runtime for each test.
@oindrillac good catch on the test-duration-minutes graph metric.
As an OpenShift product manager, I would like to see the time to test or the total runtime for each test, as it would help filter and observe tests with longer-than-expected runtimes.
By measuring this metric, we can observe the trend of test runtimes and check whether the execution time of the tests exceeds specified thresholds. If the execution of the test suite takes a long time, we may wish to optimize our test code or track down the tests that are taking too long. This metric could also be used to check for a correlation between long runtimes and flaky tests.
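For example, here is a minimal sketch of that threshold check, using an entirely hypothetical dataframe of per-test runtimes (the column names and the 60-minute threshold are made up for illustration):

```python
import pandas as pd

# Hypothetical per-test runtime data, e.g. collected from TestGrid.
df = pd.DataFrame({
    "test": ["e2e-aws-upgrade", "e2e-gcp", "unit"],
    "runtime_minutes": [93.5, 41.2, 12.7],
})

# Flag tests whose runtime exceeds the chosen threshold.
THRESHOLD_MINUTES = 60
slow_tests = df[df["runtime_minutes"] > THRESHOLD_MINUTES]
print(slow_tests)
```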
How to collect metric
In the TestGrid_EDA notebook, each test is associated with a last_run_timestamp, and each run has a fail_timestamp as well as a pass_timestamp. Is there some available documentation which explains the meaning of those labels? More specifically, does last_run_timestamp mean the timestamp of the final run?

Acceptance Criteria

A notebook in notebooks/data-sources/TestGrid/metrics/ that collects this metric and stores it in Ceph as a parquet file.

cc: @hemajv @MichaelClifford
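For the storage half of the acceptance criteria, a minimal sketch (assuming Ceph is exposed through its S3-compatible API and s3fs is installed; the bucket, key, endpoint, and environment variable names below are placeholders):

```python
import os
import pandas as pd

# Hypothetical metric dataframe collected from TestGrid.
metric_df = pd.DataFrame({
    "test": ["e2e-aws-upgrade"],
    "duration_minutes": [93.5],
    "timestamp": [pd.Timestamp("2021-01-01")],
})

# Ceph speaks the S3 protocol, so pandas can write parquet to it via s3fs.
# Endpoint, bucket, and credential variables are placeholders.
metric_df.to_parquet(
    "s3://example-bucket/testgrid/metrics/time_to_test.parquet",
    storage_options={
        "key": os.environ["S3_ACCESS_KEY"],
        "secret": os.environ["S3_SECRET_KEY"],
        "client_kwargs": {"endpoint_url": "https://s3.example-ceph.endpoint"},
    },
)
```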