Tetragon e2e Test: Run e2e tests in parallel on multiple runners #2354

Merged: 1 commit merged into cilium:main from pr/Trung-DV/reduce-e2e-test-duration on Apr 25, 2024

Conversation

Trung-DV (Contributor)

The e2e tests currently consist of 3 packages that can run independently of each other.

By splitting the tests across 3 runners, we can reduce CI time by roughly 50%: https://github.com/Trung-DV/tetragon/actions/runs/8771741177
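
For illustration, a minimal sketch of how such a split might look with a GitHub Actions matrix, one runner (and therefore one Tetragon instance) per test package; the package paths below are placeholders for illustration, not the actual paths used in this PR:

```yaml
# Sketch: run each e2e test package on its own runner.
name: e2e-tests
on: [pull_request]

jobs:
  run-e2e-test:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        # Hypothetical package paths; the real ones live in the
        # Tetragon repository's e2e test tree.
        package:
          - ./tests/e2e/tests/example-a
          - ./tests/e2e/tests/example-b
          - ./tests/e2e/tests/example-c
    steps:
      - uses: actions/checkout@v4
      - name: Run the e2e tests of a single package
        run: go test -v ${{ matrix.package }}
```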

Trung-DV requested review from willfindlay and a team as code owners on April 21, 2024, 09:53
kkourt (Contributor) commented on Apr 21, 2024

Hi,

Thanks for the PR, but it's not clear to me that this will work as expected.

Each e2e test requires its own Tetragon instance, and having multiple Tetragon instances running at the same time can be problematic. This is why we generally run the tests sequentially.

Trung-DV (Contributor, Author)

> Hi,
>
> Thanks for the PR, but it's not clear to me that this will work as expected.
>
> Each e2e test requires its own Tetragon instance, and having multiple Tetragon instances running at the same time can be problematic. This is why we generally run the tests sequentially.

Hi @kkourt,
As I understand it, we currently run the e2e tests sequentially, initializing a Tetragon instance for each test package in turn.
So I simply split the work across runners: each runner initializes its own Tetragon instance and runs a single test package.
All runners are independent environments.
Is my approach reasonable?

Trung-DV changed the title from "Tetragon e2e Test: Run e2e tests in parallel" to "Tetragon e2e Test: Run e2e tests in parallel on multiple runners" on Apr 22, 2024
mtardy added the labels area/ci (Related to CI) and release-note/ci (This PR makes changes to the CI) on Apr 23, 2024
mtardy (Member) left a comment


That's a good idea, thanks! The only downside I can see is that your workflow might require a lot of runners, and that is sometimes a bottleneck when GitHub or actuated doesn't provide us with enough of them. But that should be fine.

Is there a way we could make it easier to see which job is which?
[screenshot: the workflow job list, where the parallel e2e jobs are indistinguishable]

Perhaps you could use jobs.<job_id>.name with the last folder name of the matrix path, or the entire path.

Like:

```yaml
run-e2e-test:
  name: ${{ <your path> }}
```
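
For instance, a minimal sketch; the matrix.package variable name below is an assumption for illustration, not necessarily the name used in this PR:

```yaml
jobs:
  run-e2e-test:
    # Hypothetical: surface the matrix value so each parallel job is
    # identifiable in the Actions UI.
    name: e2e (${{ matrix.package }})
```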


netlify bot commented Apr 23, 2024

Deploy Preview for tetragon ready!

🔨 Latest commit: 87ed230
🔍 Latest deploy log: https://app.netlify.com/sites/tetragon/deploys/662798b8add0fc0008d07d84
😎 Deploy Preview: https://deploy-preview-2354--tetragon.netlify.app

Trung-DV (Contributor, Author)

I have updated it; now it looks like this: https://github.com/Trung-DV/tetragon/actions/runs/8799449115

[screenshot: the workflow run with one clearly named job per test package]

mtardy (Member) left a comment


Thanks, I like it. I think it's a great improvement!

Two review comments on .github/workflows/run-e2e-tests.yaml (outdated, resolved)
mtardy (Member) commented on Apr 23, 2024

In addition to my previous comments, could you squash your commits? Thanks!

mtardy (Member) commented on Apr 23, 2024

@willfindlay, please take a look and feel free to merge this!

willfindlay (Contributor) left a comment


Looks amazing, I just have one nitpick.

One review comment on .github/workflows/run-e2e-tests.yaml (outdated, resolved)
Signed-off-by: Trung-DV <TrungDV.PMB@gmail.com>
mtardy (Member) commented on Apr 25, 2024

Thanks!!

mtardy merged commit b8edb7e into cilium:main on Apr 25, 2024
39 checks passed
Trung-DV deleted the pr/Trung-DV/reduce-e2e-test-duration branch on April 26, 2024, 06:10
alexellis

How many more jobs would you expect this to generate, @mtardy?

Anything that could generate dozens or hundreds of jobs would probably be misusing "the commons" resource; if it's only a couple more, it is unlikely to make much difference.

mtardy (Member) commented on May 16, 2024

> How many more jobs would you expect this to generate, @mtardy?
>
> Anything that could generate dozens or hundreds of jobs would probably be misusing "the commons" resource; if it's only a couple more, it is unlikely to make much difference.

You are everywhere 👀. It just transforms one ~20 min job into three jobs that each run significantly faster (around 10 min), so overall we consume a bit more, but the workflow also finishes sooner. It should be reasonable, since each job asks for 4 CPUs and 16 GB of RAM. We have occasionally seen bottlenecks in runner provisioning on the actuated side; we hope our consumption stays reasonable, and otherwise we can find solutions.
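
For context, a minimal sketch of how a job might request a VM of that size from actuated; the exact runner label format here is an assumption, not taken from this PR:

```yaml
jobs:
  run-e2e-test:
    # Assumed actuated label encoding the requested VM size (4 vCPUs, 16 GB RAM).
    runs-on: actuated-4cpu-16gb
```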

alexellis

We've had a couple of users starting matrix builds with 250+ jobs, which of course causes some issues for the shared service.

What you're doing sounds fine.

You can right-size VMs by running vmmeter - https://actuated.dev/blog/right-sizing-vms-github-actions
