
Support for ordered test suites? (e.g. the smoke test suite runs first, then the test suite for each component, finally the performance test suite) #1278

Open
pbxqdown opened this issue Sep 26, 2023 · 22 comments

Comments

pbxqdown (Author) opened the issue on Sep 26, 2023:

If a previous test suite doesn't pass, we could skip the following tests to save time (e.g. if the smoke test fails, there is no need to run the performance test).

onsi (Owner) commented Sep 26, 2023

What you're describing is the default behavior. If you run:

ginkgo smoke component-1 component-2

it will stop after the first suite that fails.

pbxqdown (Author) commented:

@onsi this is how we run tests: ginkgo -v --mod=vendor ./tests. Can we specify a test suite name directly in the ginkgo command?
Also, wondering what happens if there are many suites - we may want only some tests (e.g. smoke) to run first.

onsi (Owner) commented Sep 26, 2023

That looks like you're just running one test suite. How many suites do you have, and how have you organized them?

onsi (Owner) commented Sep 26, 2023

By the way, this section of the docs:

https://onsi.github.io/ginkgo/#mental-model-go-modules-packages-and-tests

describes how, in Go and Ginkgo, each suite is a separate package.

pbxqdown (Author) commented Sep 28, 2023

@onsi Thanks for the doc. We have many test suites inside the ./tests directory. Is it possible to run the smoke test suite first, then the rest of the test suites? We would like a single test run so that only one test report is generated.

onsi (Owner) commented Sep 28, 2023

The reason I was asking is that if you have

tests/A
tests/B
tests/C
tests/smoke

where A, B, C, and smoke are directories, then ginkgo -v --mod=vendor ./tests won't run anything. You would need to run ginkgo -r -v --mod=vendor ./tests instead.

In any event, assuming your suites are organized into different directories, you have two options:

  1. Move smoke into a different top-level directory:
smoke
tests/A
tests/B
tests/C

Then you can run ginkgo -r -v --mod=vendor ./smoke ./tests. If smoke fails, all the other tests will be skipped.

  2. Leave smoke in tests, but run it first and then run all the suites in tests (which will re-run smoke):
tests/A
tests/B
tests/C
tests/smoke

followed by ginkgo -r -v --mod=vendor ./tests/smoke ./tests. Now if smoke fails, the other tests will be skipped; however, if it passes, it will run again when the suites under ./tests are run.


If your test suites are not in separate directories like this (which sometimes happens when users define tests in different directories but then combine them into a single suite by importing those directories), then neither of these approaches will work.

pbxqdown (Author) commented Oct 3, 2023

Our test directory is flat (e.g. A.test, B.test, and smoke.test are all files, not directories). Can we still order the tests? I'm thinking of moving the smoke tests into a directory so they run first, but that still wouldn't be perfect, since the smoke tests would run twice.

onsi (Owner) commented Oct 3, 2023

Got it - so what you have is a single suite, and you want to run different subsets of it with a single invocation of ginkgo. That is not possible currently.

I typically run a check in my BeforeSuite to ensure the suite should run (see the sketch below). It sounds like you want multiple specs to constitute that check, and that too is not possible currently. If you generate a separate suite called smoke in its own folder and then run ginkgo smoke tests, I bet you get what you want with no repeats.
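For reference, a minimal sketch of that BeforeSuite gating pattern (the suite name and health-check endpoint below are made up for illustration):

```go
package tests_test

import (
	"net/http"
	"testing"

	. "github.com/onsi/ginkgo/v2"
	. "github.com/onsi/gomega"
)

func TestComponents(t *testing.T) {
	RegisterFailHandler(Fail)
	RunSpecs(t, "Components Suite")
}

// If this check fails, the suite fails immediately and every spec is skipped,
// which gives the "don't bother running the rest" behavior within one suite.
var _ = BeforeSuite(func() {
	resp, err := http.Get("http://localhost:8080/healthz") // hypothetical smoke endpoint
	Expect(err).NotTo(HaveOccurred())
	defer resp.Body.Close()
	Expect(resp.StatusCode).To(Equal(http.StatusOK))
})
```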

pbxqdown (Author) commented Oct 4, 2023

Thanks @onsi. We run a single test suite with different types of tests distinguished by test labels. Is the best practice to split it into multiple test suites? Would we still have only one test result file, so that only one test report is generated?

Also a side question: what is the difference between "--flakeattempts" and "--repeat"? We would like to use this flag to cure some test flakes.

onsi (Owner) commented Oct 4, 2023

Yes, if you run a single invocation of ginkgo, then ginkgo will concatenate the results into a single file unless you use --keep-separate-reports. I suggest reading this part of the docs and trying it with a simple setup - e.g. make a folder, use ginkgo bootstrap and ginkgo generate to create some test suites and test files, add a few Its, then play around with the various options and invocations of Ginkgo. A rough sketch of such a playground follows.
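For example, a throwaway playground could look something like this (the module path, directory names, and spec names are just illustrative):

```bash
mkdir playground && cd playground
go mod init example.com/playground
go get github.com/onsi/ginkgo/v2 github.com/onsi/gomega

mkdir -p tests/smoke tests/componentA
(cd tests/smoke && ginkgo bootstrap && ginkgo generate basics)        # a smoke suite plus one spec file
(cd tests/componentA && ginkgo bootstrap && ginkgo generate feature)  # a second suite

ginkgo -r ./tests                              # run every suite under ./tests
ginkgo -r --json-report=report.json ./tests    # same run, also emitting a single JSON report
```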

As for whether it is better to split up a single test suite into multiple suites, or to use test labels... I think that really depends on what works better for you. The only caveat is that if you want to run different sets of tests with different label filters, those would be separate invocations of ginkgo, each of which generates its own report. The report is just a JSON or JUnit XML file, though, which can be merged after the fact without too much effort.
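As a rough sketch of the label-based approach (the spec names and labels here are made up; Label and --label-filter are the Ginkgo v2 features being referred to):

```go
package tests_test

import . "github.com/onsi/ginkgo/v2"

// Specs in the single suite carry labels describing which "tier" they belong to.
var _ = Describe("checkout service", Label("smoke"), func() {
	It("answers a basic request", func() {
		// ... assertions go here ...
	})
})

var _ = Describe("checkout service under load", Label("performance"), func() {
	It("stays within the latency budget", func() {
		// ...
	})
})
```

Running ginkgo --label-filter=smoke ./tests and then ginkgo --label-filter='!smoke' ./tests would be two separate invocations, each producing its own report.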

> Also a side question: what is the difference between "--flakeattempts" and "--repeat"? We would like to use this flag to cure some test flakes.

--flakeattempts=N and the FlakeAttempts(N) decorator both tell Ginkgo to rerun a failing test up to N times. If any attempt passes, the test is reported as passing. This is a way of ignoring flakes and doesn't actually "cure" them. Docs here

--repeat=N and MustPassRepeatedly(N) are sort of the opposite. They tell Ginkgo to rerun the entire suite (with --repeat) or a specific spec (with MustPassRepeatedly) N times and require the suite/spec to pass every time. If you want to actually "cure" test flakes, then I recommend using MustPassRepeatedly to rerun flakey specs so you can reproduce the flake and figure out how to actually fix it. Docs here
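A small sketch showing the two decorators side by side (the spec names and bodies are placeholders):

```go
package tests_test

import . "github.com/onsi/ginkgo/v2"

var _ = Describe("flake handling", func() {
	// Retried up to 3 times; the spec is reported as passing (and annotated as
	// having taken multiple attempts) as soon as one attempt passes.
	It("talks to the occasionally flaky downstream service", FlakeAttempts(3), func() {
		// ...
	})

	// Run 10 times in a row and required to pass every time - useful for
	// reproducing a flake so it can actually be fixed.
	It("never races on the shared cache", MustPassRepeatedly(10), func() {
		// ...
	})
})
```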

pbxqdown (Author) commented Oct 9, 2023

Thanks, this is very helpful! I'm wondering whether the test report contains flake info. Currently we are generating a PDF report, but it only shows pass/fail/skip for each test case. There is no "flake" state for a test case.

onsi (Owner) commented Oct 9, 2023

How are you generating the PDF report?

pbxqdown (Author) commented Oct 9, 2023

We are using the junit2html Python package.

onsi (Owner) commented Oct 9, 2023

Ginkgo's JSON format includes the number of attempts for each spec:

NumAttempts int

The JUnit format does not support adding structured information for each spec, so the number of attempts must be inferred from the spec's output.
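So a small one-off tool could surface flaky specs directly from the JSON report. This is only a sketch, not something that ships with Ginkgo: the field names below are based on Ginkgo v2's report types (and the assumption that the file produced by --json-report holds one entry per suite), so they are worth double-checking against an actual report.json.

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// Minimal mirror of the report fields this sketch cares about.
type specReport struct {
	LeafNodeText string
	NumAttempts  int
}

type suiteReport struct {
	SuiteDescription string
	SpecReports      []specReport
}

func main() {
	data, err := os.ReadFile(os.Args[1]) // e.g. report.json from --json-report=report.json
	if err != nil {
		panic(err)
	}
	var suites []suiteReport // assumed: one entry per suite in the run
	if err := json.Unmarshal(data, &suites); err != nil {
		panic(err)
	}
	for _, suite := range suites {
		for _, spec := range suite.SpecReports {
			if spec.NumAttempts > 1 {
				fmt.Printf("flaky? %s: %q took %d attempts\n",
					suite.SuiteDescription, spec.LeafNodeText, spec.NumAttempts)
			}
		}
	}
}
```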

pbxqdown (Author) commented Oct 9, 2023

Does that mean junit2html cannot be used to generate a report with flake info? Could you recommend a PDF-generating tool that works best with Ginkgo test output?

onsi (Owner) commented Oct 9, 2023

I have no recommendation for you, sorry - this is the first time I've seen a request for a PDF version of the report, and the JUnit format does not support adding additional data. Have you inspected the output of a flakey spec to see what does get emitted to the PDF? You should be getting a lot of detail from the test run, which will include each attempt.

Note that a flakey spec can ultimately pass or fail - which is why "flake" is not a valid end state but rather an annotation on the spec.

Generating an HTML version of the JSON report would not be hard - though making it look pretty could be an undertaking.

pbxqdown (Author) commented Oct 9, 2023

Got it. The reason we use JUnit is that it is an industry standard, so we can find tools to generate a report from it. Does the JSON output comply with a standard such that it can be visualized and read easily? Here is a sample of the PDF report from junit2html: [screenshot of a junit2html report]

pbxqdown (Author) commented Oct 9, 2023

It looks like we can indeed see the flake info in the JUnit report's log, but there is no prominent indicator at the top of the report.

onsi (Owner) commented Oct 10, 2023

Correct - again, the junit format does not support such things. Perhaps you can step back and share the problem you are trying to solve so I can help. Is your goal to identify flaky tests and fix them? Is your goal to track whether a test suite is becoming more flakey over time? Is your goal to understand whether some environments are flakier than others?

pbxqdown (Author) commented:

Yup, basically we would like to run the test suite in a zone to ensure our components are healthy. But a zone's health is affected by many factors (host infra, network, etc.), and any of them can cause tests to fail occasionally. We would like to add flake attempts to rule out environmental issues.
The flake attempts flag looks quite good; we can see how many tests flaked in the test log. We send the test summary as a PDF report to the entire team through email. It would be even better if the summary report contained flake info, so we could save the effort of reading through longer logs :)

onsi (Owner) commented Oct 31, 2023

In case there's interest, please take a look at the new proposal for managing shared resources and having finer-grained control over parallelism here: #1292

pbxqdown (Author) commented Nov 3, 2023

Thanks! This proposal would be very helpful in k8s-related test scenarios.
