
feat(podtemplate): set GOMEMLIMIT, GOMAXPROCS when limits are defined #1029

Merged
merged 2 commits into traefik:master from cpierre/GOMAX on Apr 9, 2024

Conversation

@ChandonPierre (Contributor) commented on Apr 6, 2024

What does this PR do?

Always define GOMEMLIMIT and GOMAXPROCS env vars when memory limits and CPU limits are defined.
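
As an illustration (not necessarily the exact code merged here), the pod template addition could look roughly like the sketch below, assuming the chart exposes container limits under `.Values.resources.limits`:

```yaml
# Sketch only: entries appended to the container's env list in the pod template.
{{- with .Values.resources.limits }}
{{- if .cpu }}
- name: GOMAXPROCS
  valueFrom:
    resourceFieldRef:
      resource: limits.cpu      # the downward API rounds 250m up to "1"
{{- end }}
{{- if .memory }}
- name: GOMEMLIMIT
  valueFrom:
    resourceFieldRef:
      resource: limits.memory   # emitted in bytes, e.g. 64Mi -> "67108864"
{{- end }}
{{- end }}
```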

Motivation

Traefik is written in Go, and is thus subject to CPU throttling when CPU limits are set on a container without GOMAXPROCS defined. We see this reflected in the container_cpu_cfs_throttled_seconds_total metric in our environment, where the CPU throttle rate exceeds 1000%.

More

  • Yes, I updated the tests accordingly
  • Yes, I ran make test and all the tests passed

@mloiseleur (Contributor) commented:

\o/ That's right, @ChandonPierre!

It's confirmed by other blog posts and by the existence of automaxprocs.

@mloiseleur (Contributor) left a comment

Thanks for this PR 👍
Would you please add a test ensuring there is no GOMAXPROCS & GOMEMLIMIT when resources are not set (since that's the default behavior)?
Your PR is not currently doing that: I'm suggesting to add it, just in case, as a safety net.

@ChandonPierre
Copy link
Contributor Author

> Thanks for this PR 👍 Would you please add a test ensuring there is no GOMAXPROCS & GOMEMLIMIT when resources are not set (since that's the default behavior)? Your PR is not currently doing that: I'm suggesting to add it, just in case, as a safety net.

Certainly reasonable! Added tests for the undefined-limits case in 5bd8be3.
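
For readers following along, such a negative test could look roughly like the helm-unittest sketch below; the case name and paths here are hypothetical, and the actual assertions live in commit 5bd8be3:

```yaml
# Hypothetical helm-unittest case; see commit 5bd8be3 for the real tests.
- it: should not set GOMAXPROCS or GOMEMLIMIT when no limits are defined
  set:
    resources: {}
  asserts:
    - notContains:
        path: spec.template.spec.containers[0].env
        content:
          name: GOMAXPROCS
        any: true
    - notContains:
        path: spec.template.spec.containers[0].env
        content:
          name: GOMEMLIMIT
        any: true
```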

@darkweaver87 (Contributor) left a comment

LGTM

@traefiker traefiker merged commit e4f2aa8 into traefik:master Apr 9, 2024
1 check passed
@ChandonPierre ChandonPierre deleted the cpierre/GOMAX branch April 9, 2024 13:11
alemorcuq pushed a commit to bitnami-labs/sealed-secrets that referenced this pull request May 23, 2024
**Description of the change**

Set `GOMAXPROCS` and `GOMEMLIMIT` environment variables based on
container resources.

Inspired by traefik/traefik-helm-chart#1029.

**Benefits**

This should reduce potential CPU throttling and OOMKills on containers.

**Possible drawbacks**

This creates an empty `env` key for users who do not set resource values.
That is only a little ugly and should not be harmful. Alternatively, we
could add a conditional wrapper around the whole `env` block so it only
appears when a value is set (see the sketch below), but that would become
more complicated if additional env vars were added in the future.
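
For reference, such a wrapper could look roughly like this Helm template sketch (the value paths are illustrative, not the chart's actual ones):

```yaml
{{- /* Hypothetical wrapper: the env key disappears entirely when no limits are set */}}
{{- with .Values.resources.limits }}
env:
  {{- if .cpu }}
  - name: GOMAXPROCS
    valueFrom:
      resourceFieldRef:
        resource: limits.cpu
  {{- end }}
  {{- if .memory }}
  - name: GOMEMLIMIT
    valueFrom:
      resourceFieldRef:
        resource: limits.memory
  {{- end }}
{{- end }}
```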

**Additional information**

[`resourceFieldRef`](https://kubernetes.io/docs/concepts/workloads/pods/downward-api/#downwardapi-resourceFieldRef)
is a Kubernetes directive created specifically for passing
resource-related values: it rounds the CPU value up to the nearest whole
number (e.g. 250m becomes 1) and passes the memory limit as a numeric
byte value, so `64Mi` results in the environment variable being set to
`67108864`. By design, this makes the values fully compatible with what
the Go runtime expects for GOMAXPROCS and GOMEMLIMIT.

An example is documented within the Kubernetes documentation itself:
https://kubernetes.io/docs/tasks/inject-data-application/environment-variable-expose-pod-information/#use-container-fields-as-values-for-environment-variables.
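
Concretely, a rendered manifest following the pattern from that page could look like this minimal, self-contained example (pod name and image are placeholders):

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: gomaxprocs-demo                       # placeholder name
spec:
  containers:
    - name: app
      image: registry.example.com/app:latest  # placeholder image
      resources:
        limits:
          cpu: 250m
          memory: 64Mi
      env:
        - name: GOMAXPROCS
          valueFrom:
            resourceFieldRef:
              resource: limits.cpu      # rendered as "1" (250m rounded up)
        - name: GOMEMLIMIT
          valueFrom:
            resourceFieldRef:
              resource: limits.memory   # rendered as "67108864" (64Mi in bytes)
```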

---------

Signed-off-by: Jesper Noordsij <jesper.noordsij@gmail.com>
Labels: kind/enhancement (New feature or request)

4 participants