
[EKS] [request]: Kubernetes 1.13 Patch Update (to 1.13.8) #411

Closed
schahal opened this issue Jul 11, 2019 · 10 comments
Labels
EKS Amazon Elastic Kubernetes Service

Comments

@schahal

schahal commented Jul 11, 2019

Tell us about your request
Thanks for supporting kubernetes v1.13 on EKS!

This issue is to see if EKS can bump the Kubernetes patch version from v1.13.7 to v1.13.8.

Which service(s) is this request for?
EKS - both the control plane and eks-optimized AMIs

Tell us about the problem you're trying to solve. What are you trying to do, and why is it hard?
Kubernetes 1.13.8 contains an important fix for a regression introduced in v1.13, one that particularly affects EKS' use of Amazon's VPC CNI plugin: when IP addresses are in the cooling period (the 30 seconds after pod termination during which those IPs can't be reused), the kubelet no longer retried sandbox creation on failure, regardless of the pod's restart policy. The 1.13.8 patch restores that retry behavior.

Are you currently working around this issue?
Until this is live, we have to revert to using EKS' Kubernetes 1.12 (which is less than ideal). Thanks for any updates here!

@alfredkrohmer

As far as I can tell, this only affects the kubelet, right? So it would be sufficient to use version 1.13.8 of the kubelet to get the fix.

@schahal

schahal commented Jul 15, 2019

Indeed, possibly similar to what EKS did with 1.11, I believe: their control plane is still on v1.11.8 while the worker nodes are on v1.11.9.

@schahal schahal changed the title [EKS] Kubernetes 1.13 Patch Update (to 1.13.8) [EKS] [request]: Kubernetes 1.13 Patch Update (to 1.13.8) Jul 16, 2019
@zanitete

zanitete commented Jul 17, 2019

I confirm that rebuilding the ami using kubelet v1.13.8 fixed the issue.

@alfredkrohmer

@mogren Now that the control plane has been on 1.13.8 since last weekend, would you be able to release a new AMI with 1.13.8?

@pawelprazak

FYI: the 1.13.8 AMI suddenly changed the docker group ID from 995 to 994, so you need to adjust your build scripts and Dockerfiles accordingly.

Tail of /etc/group:

cgred:x:995:
docker:x:994:ec2-user
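
Since the GID can evidently shift between AMI builds, build scripts are safer looking the group up by name rather than hardcoding a number. A minimal sketch (the sample /etc/group lines are the ones from the AMI above):

```shell
# Sample of the AMI's /etc/group after the change (docker is now GID 994)
group_db='cgred:x:995:
docker:x:994:ec2-user'

# Look up the docker GID by group name instead of hardcoding 995 or 994
docker_gid="$(printf '%s\n' "$group_db" | awk -F: '$1 == "docker" { print $3 }')"
echo "docker gid: ${docker_gid}"   # prints "docker gid: 994"
```

On a live node you would read the real group database instead, e.g. `getent group docker | cut -d: -f3`.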

@M00nF1sh

M00nF1sh commented Aug 22, 2019

@pawelprazak
Hi, you shouldn't rely on the group ID of a specific group. (Is there a reason you have to rely on it?)
BTW, the group ID changed due to awslabs/amazon-eks-ami@41f4dd9, which occupied a group ID :D

@pawelprazak

Yes, not relying on the docker GID would be best, but unfortunately that was the only way we found to enable building container images with the docker command from inside a container (pod) that runs as a non-root user.

@cdenneen

Looks like they need to get patches out for the control-plane and worker images for security updates:

https://www.bleepingcomputer.com/news/security/severe-flaws-in-kubernetes-expose-all-servers-to-dos-attacks/

1.13.10 in this case.

@andreamaruccia

@aws would it be possible for you to publish EKS node AMIs as soon as patches come out? In the case of 1.13.8 our autoscaling is not working as expected, and with 1.13.10 it is clear that DoS attacks can be a threat. It seems we have to wait an additional month or two, which IMHO defeats the purpose of patches.

@tabern

tabern commented Nov 15, 2020

Closing this as EKS no longer supports Kubernetes v1.13.

@tabern tabern closed this as completed Nov 15, 2020