
Kubelet log flooded by SyncLoop on nginx-ingress-controller #30052

Closed · Leen15 opened this issue Nov 12, 2020 · 2 comments

Leen15 commented Nov 12, 2020

Hi, I just upgraded Rancher from 2.4.8 to 2.5.2.
Rancher is installed on Docker, and the same node was the control plane of a 2-node cluster (1 control plane, 1 worker), version 1.18.10, deployed through the Rancher UI.
After the upgrade, the node's load went out of control, so I promoted the worker to control plane and tried to convert the first node into a worker.
After multiple attempts, I'm still not able to use that node as a worker (I receive multiple errors about ephemeral storage; I don't know why), so I removed it and kept the cluster with only one node.
Now the kubelet log on this node is flooded with entries like these:

I1112 15:46:41.012218    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.029633    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.046936    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.067353    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.087268    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.106162    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.131468    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.147064    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.168857    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.180464    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.263065    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.263308    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.263412    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.266875    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.276597    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.286090    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.305156    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.315786    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.325132    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.334698    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.348565    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
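A flood like this can be quantified straight from the log. Below is a minimal sketch that counts SyncLoop UPDATE events per pod; the embedded sample lines (and the `kubelet.log` file name) are assumptions for illustration, and on a real node you would pipe in something like `journalctl -u kubelet` instead:

```shell
# Count SyncLoop UPDATE events per pod in a kubelet log snippet.
# The sample lines below are illustrative; use the real kubelet log instead.
cat > kubelet.log <<'EOF'
I1112 15:46:41.012218    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.029633    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.046936    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
EOF
# Total UPDATE events, then a per-pod breakdown (most-updated pod first).
# The pod reference is the 4th double-quote-delimited field of each line.
total=$(grep -c 'SyncLoop (UPDATE' kubelet.log)
echo "UPDATE events: $total"
grep 'SyncLoop (UPDATE' kubelet.log | awk -F'"' '{print $4}' | sort | uniq -c | sort -rn
```

The per-pod breakdown makes it easy to see whether a single pod (here the ingress controller) is responsible for the flood.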

At the same time, both the kubelet and the api-server are consuming a lot of the node's resources.

Any idea why this is happening and how to resolve it?

The cluster is working, and the nginx ingress too.

Thanks


Leen15 commented Nov 13, 2020

The issue seems to be related to #30045.
As suggested by @superseb on Slack, I disabled Project Network Isolation and the log flood is gone. Node load is back to normal as well.
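As a sanity check that the flood has actually stopped, the peak per-second UPDATE rate can be estimated from the kubelet timestamps (the excerpt in the issue shows roughly 20 updates within a single second). A minimal shell sketch; the sample lines and the `sample-kubelet.log` file name are assumptions for illustration:

```shell
# Peak number of SyncLoop UPDATE events within one wall-clock second,
# grouped by the HH:MM:SS prefix of the kubelet timestamp (field 2).
# The sample lines are illustrative; feed in the real kubelet log instead.
cat > sample-kubelet.log <<'EOF'
I1112 15:46:41.012218    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:41.029633    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
I1112 15:46:42.105301    2795 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-qcmfj_ingress-nginx(4e192151-4cd5-452b-9da1-8487645b8329)"
EOF
peak=$(awk '/SyncLoop \(UPDATE/ { c[substr($2, 1, 8)]++ }
            END { m = 0; for (s in c) if (c[s] > m) m = c[s]; print m }' sample-kubelet.log)
echo "peak UPDATE events in one second: $peak"
```

After the fix, this number should drop to near zero for a healthy node.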

Leen15 closed this as completed Nov 13, 2020
sowmyav27 (Contributor) commented Nov 13, 2020

Verified an upgrade from 2.4.8 to 2.5.2:

  • On Rancher 2.4.8, deploy a custom cluster with Project Network Isolation enabled.
  • Enable Monitoring in this cluster.
  • Upgrade Rancher to 2.5.2.
  • Kubelet logs on the worker node:
I1113 17:21:33.933579   10578 kubelet.go:1917] SyncLoop (UPDATE, "api"): "exporter-node-cluster-monitoring-vwpb7_cattle-prometheus(a67eaa77-5cdd-4683-8e16-17fdda1f43e6)"
I1113 17:21:33.950426   10578 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-28k4x_ingress-nginx(ebde362b-fa65-45c3-9f65-1aae0d1a589c)"
I1113 17:21:33.994919   10578 kubelet.go:1917] SyncLoop (UPDATE, "api"): "exporter-node-cluster-monitoring-vwpb7_cattle-prometheus(a67eaa77-5cdd-4683-8e16-17fdda1f43e6)"
I1113 17:21:34.011517   10578 kubelet.go:1917] SyncLoop (UPDATE, "api"): "nginx-ingress-controller-28k4x_ingress-nginx(ebde362b-fa65-45c3-9f65-1aae0d1a589c)"
  • Monitoring metrics show a considerable increase in CPU utilization after the upgrade (the upgrade was done around 9:20 am).

[Screenshot: cluster CPU utilization graph, taken 2020-11-13 at 9:48 AM]
