
kube-vip fails to run with Kubernetes 1.29 #1792

Closed
Amulyam24 opened this issue May 23, 2024 · 1 comment · Fixed by #1798
Labels: area/provider/ibmcloud (Issues or PRs related to ibmcloud provider), kind/bug (Categorizes issue or PR as related to a bug)
@Amulyam24 (Contributor) commented:

/kind bug
/area provider/ibmcloud

What steps did you take and what happened:
With the current plan to move the PowerVS CI to Kubernetes 1.29, testing showed that kubeadm init fails on the control plane node. kube-vip fails on 1.29 due to the following issues:

  1. hostAliases do not work as static pod manifests in Kubernetes v1.29 #692

  2. kube-vip requires super-admin.conf with Kubernetes 1.29 #684
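For context on the second issue: in Kubernetes 1.29, kubeadm no longer grants the admin.conf credential cluster-admin rights until the cluster is up, so a kube-vip static pod that mounts admin.conf cannot talk to the API server during kubeadm init. A commonly discussed workaround (from the kube-vip #684 thread) is to point the static pod's kubeconfig mount at super-admin.conf instead. The snippet below is a hedged sketch of that change, not the exact manifest CAPI or kube-vip generates; the image tag and field layout are illustrative.

```yaml
# Sketch of a kube-vip static pod with the kubeconfig hostPath switched
# to super-admin.conf for Kubernetes 1.29 (workaround discussed in
# kube-vip #684). Illustrative only; adapt to the manifest you generate.
apiVersion: v1
kind: Pod
metadata:
  name: kube-vip
  namespace: kube-system
spec:
  hostNetwork: true
  containers:
  - name: kube-vip
    image: ghcr.io/kube-vip/kube-vip:v0.8.0   # version is illustrative
    volumeMounts:
    - mountPath: /etc/kubernetes/admin.conf   # path kube-vip reads inside the pod
      name: kubeconfig
  volumes:
  - name: kubeconfig
    hostPath:
      path: /etc/kubernetes/super-admin.conf  # on 1.29; was admin.conf on <=1.28
```

On pre-1.29 clusters the hostPath would stay at /etc/kubernetes/admin.conf, which is why this needs to be conditional on the Kubernetes version being bootstrapped.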

What did you expect to happen:
Cluster to be created successfully with k8s v1.29

Anything else you would like to add:
[Miscellaneous information that will assist in solving the issue.]

Environment:

  • Cluster-api version:
  • Minikube/KIND version:
  • Kubernetes version (use kubectl version):
  • OS (e.g. from /etc/os-release):
@k8s-ci-robot added the kind/bug and area/provider/ibmcloud labels on May 23, 2024
@Amulyam24 (Contributor, Author) commented:

/assign
