Update the dynamic volume limit in GCE PD #77311
Conversation
Currently, GCE PD supports a maximum of 128 disks attached to a node for all machine types except shared-core. This PR updates the limit accordingly. Change-Id: Id9dfdbd24763b6b4138935842c246b1803838b78
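The idea behind the change can be sketched as a per-machine-type limit lookup. This is only an illustrative sketch, not the actual Kubernetes GCE PD code: the function name `maxAttachableVolumes`, the prefix list, and the shared-core limit of 16 are assumptions for the example; only the 128 figure comes from the PR description.

```go
package main

import (
	"fmt"
	"strings"
)

// maxAttachableVolumes returns a hypothetical per-node dynamic volume
// limit based on the GCE machine type. Shared-core machine types get a
// lower limit; all other types get 128, per the PR description.
func maxAttachableVolumes(machineType string) int {
	// Prefixes assumed here to identify shared-core machine types.
	sharedCorePrefixes := []string{"f1-", "g1-"}
	for _, p := range sharedCorePrefixes {
		if strings.HasPrefix(machineType, p) {
			return 16 // assumed lower limit for shared-core types
		}
	}
	return 128 // updated limit for all other machine types
}

func main() {
	fmt.Println(maxAttachableVolumes("n1-standard-8"))
	fmt.Println(maxAttachableVolumes("f1-micro"))
}
```

The scheduler can then use this per-node limit to avoid placing pods whose volumes would exceed the number of disks the node can actually attach.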
/kind bug
/lgtm

/lgtm

Assigning patch release managers:
[APPROVALNOTIFIER] This PR is APPROVED. This pull-request has been approved by: jingxu97, saad-ali. The full list of commands accepted by this bot can be found here. The pull request process is described here.
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing
/hold
@tpepper The GCE platform changed the number of disks that are supported per node, and these PRs update those limits on the Kubernetes side. The change sits on the edge between a bug and a feature, but since we've received customer complaints about it (it appears to customers like a bug), we'd like to cherry-pick it.
Since this is small, provider-specific, and SIG-approved, will merge.

/hold cancel