[preview] Enable ws-daemon cpu limiting #14441


Merged: 1 commit into main, Nov 14, 2022
Conversation

@Furisto (Member) commented Nov 4, 2022

Description

Enable ws-daemon CPU limiting in preview environments, so that it does not have to be activated manually every time someone wants to work on CPU limiting.

Related Issue(s)

n.a.

How to test

  1. generate a config like so: ./installer init > config.yaml
  2. generate a k8s manifest like so: ./installer render -n $(kubens -c) -c config.yaml > k8s.yaml
  3. fake a license and feature file like so: echo "foo" > /tmp/license && echo '"bar"' > /tmp/defaultFeatureFlags
  4. call this script like so: ./.werft/jobs/build/installer/post-process.sh 1234 5678 2 your-branch-just-dashes
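The four steps above can be sketched as one script. This is a convenience wrapper, not part of the PR: it assumes you run it from a gitpod checkout with the installer built at ./installer and kubens on PATH, and the post-process.sh arguments are the sample values from step 4.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Step 3: fake the license and feature-flag files that post-process.sh reads.
# These paths are exactly the ones given in the steps above.
echo "foo" > /tmp/license
echo '"bar"' > /tmp/defaultFeatureFlags

# Steps 1, 2, and 4 need the installer binary; skip them if it is not built.
if [ -x ./installer ]; then
  # Step 1: generate a default installer config.
  ./installer init > config.yaml
  # Step 2: render Kubernetes manifests for the current namespace.
  ./installer render -n "$(kubens -c)" -c config.yaml > k8s.yaml
  # Step 4: post-process with the sample arguments from the steps above.
  ./.werft/jobs/build/installer/post-process.sh 1234 5678 2 your-branch-just-dashes
fi
```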

Release Notes

None

Documentation

Werft options:

  • /werft with-local-preview
    If enabled, this builds install/preview
  • /werft with-preview
  • /werft with-large-vm
  • /werft with-integration-tests=workspace
    Valid options are all, workspace, webapp, ide

@werft-gitpod-dev-com

started the job as gitpod-build-enable-cpu-limiting.1 because the annotations in the pull request description changed
(with .werft/ from main)

@Furisto Furisto requested a review from a team November 4, 2022 15:44
@Furisto Furisto self-assigned this Nov 4, 2022
@Furisto Furisto added the "team: workspace" (Issue belongs to the Workspace team) label Nov 4, 2022
@Furisto (Member, Author) commented Nov 4, 2022

/werft run with-integration-tests=workspace with-clean-slate-deployment

👍 started the job as gitpod-build-enable-cpu-limiting.2
(with .werft/ from main)

@werft-gitpod-dev-com

started the job as gitpod-build-enable-cpu-limiting.3 because the annotations in the pull request description changed
(with .werft/ from main)

Reviewed change in .werft/jobs/build/installer/post-process.sh (the single jq invocation becomes a pipeline that also enables cpulimit):

| jq ".service.address = $WS_DAEMON_PORT" > /tmp/"$NAME"-"$KIND"-overrides.json
| jq ".service.address = $WS_DAEMON_PORT" \
| jq ".daemon.cpulimit.enabled = true" \
| jq ".daemon.cpulimit.totalBandwidth = 12" \
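The jq filters above set the ws-daemon service address and switch on the cpulimit overrides. A minimal, self-contained sketch of the same pipeline, applied to an empty config so the effect of each assignment is visible (field names are taken from the hunk; the input and port value are illustrative, and the block skips itself if jq is not installed):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Illustrative stand-in for the rendered ws-daemon config.
WS_DAEMON_PORT=8080

if command -v jq >/dev/null 2>&1; then
  # Same three assignments as the reviewed hunk, applied to an empty object.
  echo '{}' \
    | jq ".service.address = $WS_DAEMON_PORT" \
    | jq ".daemon.cpulimit.enabled = true" \
    | jq ".daemon.cpulimit.totalBandwidth = 12" \
    > /tmp/ws-daemon-overrides-sketch.json
else
  # jq unavailable: emit the expected result directly so the sketch still
  # shows the shape of the overridden config.
  printf '{"service":{"address":8080},"daemon":{"cpulimit":{"enabled":true,"totalBandwidth":12}}}\n' \
    > /tmp/ws-daemon-overrides-sketch.json
fi
```

Running it leaves /tmp/ws-daemon-overrides-sketch.json containing a config object whose service.address is 8080 and whose daemon.cpulimit subtree has enabled=true and totalBandwidth=12.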
Contributor

I think we have to consider the case of a large VM.

Member Author

I think it's fine. For the large VM the value is correct, and if you set the actual total bandwidth for the small VM, you are unlikely to get the burst, which is what you would want for testing purposes.

Contributor

I'm afraid that it will affect the integration test time. For example, there is a k3s test case, and that workspace needs a lot of resources to start up k3s.
I am also concerned that these limits may deplete resources and cause the integration-agent to fail to launch. Can you run the tests on a normal VM (not large) and see what happens?

@utam0k (Contributor) commented Nov 6, 2022

Hi, @Furisto! Thanks for your great improvement; I would like to get the changes in this PR. But CI failed in the integration test. May I ask you to hold off on merging this until my PR #14260 is merged?

@Furisto Furisto force-pushed the enable-cpu-limiting branch from 1983a20 to 225f46e Compare November 14, 2022 13:46
@Furisto (Member, Author) commented Nov 14, 2022

@utam0k PTAL

@roboquat roboquat merged commit 7a0d737 into main Nov 14, 2022
@roboquat roboquat deleted the enable-cpu-limiting branch November 14, 2022 18:36
@roboquat roboquat added the "deployed: workspace" (Workspace team change is running in production) and "deployed" (Change is completely running in production) labels Dec 7, 2022
Labels
  • deployed: workspace (Workspace team change is running in production)
  • deployed (Change is completely running in production)
  • release-note-none
  • size/XS
  • team: workspace (Issue belongs to the Workspace team)
Projects
No open projects
Status: Done
Development


4 participants