
Acquiring lock at startup is taking long #4246

Closed
squakez opened this issue Apr 12, 2023 · 1 comment · Fixed by #4280
Labels: kind/bug Something isn't working

squakez commented Apr 12, 2023

I recall lock acquisition taking a long time some releases ago. Now it seems to take around 15 seconds again, which I think is longer than it should:

camel-k-operator-6844754cc8-98bw7 camel-k-operator {"level":"info","ts":1681297670.2073886,"logger":"camel-k.cmd","msg":"attempting to acquire leader lease default/camel-k-lock...\n"}
camel-k-operator-6844754cc8-98bw7 camel-k-operator {"level":"info","ts":1681297685.6172009,"logger":"camel-k.cmd","msg":"successfully acquired lease default/camel-k-lock\n"}

We need to keep an eye on it.
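For reference, the ~15s gap in the logs matches controller-runtime's default leader-election `LeaseDuration`. A minimal sketch of where those timings are configured (the function name and values are illustrative, not camel-k's actual setup; the values mirror controller-runtime's defaults):

```go
package main

import (
	"time"

	ctrl "sigs.k8s.io/controller-runtime"
	"sigs.k8s.io/controller-runtime/pkg/manager"
)

// newManager is a hypothetical manager setup showing the
// leader-election timing knobs.
func newManager() (manager.Manager, error) {
	// A new candidate must wait out LeaseDuration (15s by default)
	// before it can take over a lease that was never explicitly
	// released, which matches the ~15s gap seen in the logs above.
	leaseDuration := 15 * time.Second
	renewDeadline := 10 * time.Second
	retryPeriod := 2 * time.Second

	return ctrl.NewManager(ctrl.GetConfigOrDie(), ctrl.Options{
		LeaderElection:          true,
		LeaderElectionID:        "camel-k-lock",
		LeaderElectionNamespace: "default",
		LeaseDuration:           &leaseDuration,
		RenewDeadline:           &renewDeadline,
		RetryPeriod:             &retryPeriod,
	})
}
```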

@squakez squakez added the kind/bug Something isn't working label Apr 13, 2023
@squakez squakez self-assigned this Apr 13, 2023

squakez commented Apr 13, 2023

It seems this happens because we're failing to release the lease, so the new instance waits the canonical 15 seconds to make sure the previous leader is no longer alive; it occurs whenever a previous Lease is present in the namespace. When we shut down the operator, we can see an error:

camel-k-operator-6844754cc8-kj76g camel-k-operator {"level":"info","ts":1681395538.729762,"msg":"Wait completed, proceeding to shutdown the manager"}
camel-k-operator-6844754cc8-kj76g camel-k-operator {"level":"error","ts":1681395538.7311943,"logger":"camel-k.cmd","msg":"Failed to release lock: leases.coordination.k8s.io \"camel-k-lock\" is forbidden: User \"system:serviceaccount:default:camel-k-operator\" cannot update resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"default\"\n","stacktrace":"k8s.io/klog/v2.(*loggingT).output\n\tk8s.io/klog/v2@v2.80.0/klog.go:877\nk8s.io/klog/v2.(*loggingT).printfDepth\n\tk8s.io/klog/v2@v2.80.0/klog.go:739\nk8s.io/klog/v2.(*loggingT).printf\n\tk8s.io/klog/v2@v2.80.0/klog.go:721\nk8s.io/klog/v2.Errorf\n\tk8s.io/klog/v2@v2.80.0/klog.go:1555\nk8s.io/client-go/tools/leaderelection.(*LeaderElector).release\n\tk8s.io/client-go@v0.25.6/tools/leaderelection/leaderelection.go:306\nk8s.io/client-go/tools/leaderelection.(*LeaderElector).renew\n\tk8s.io/client-go@v0.25.6/tools/leaderelection/leaderelection.go:289\nk8s.io/client-go/tools/leaderelection.(*LeaderElector).Run\n\tk8s.io/client-go@v0.25.6/tools/leaderelection/leaderelection.go:212\nsigs.k8s.io/controller-runtime/pkg/manager.(*controllerManager).startLeaderElection.func3\n\tsigs.k8s.io/controller-runtime@v0.13.1/pkg/manager/internal.go:647"}
camel-k-operator-6844754cc8-kj76g camel-k-operator {"level":"error","ts":1681395538.7313108,"msg":"error received after stop sequence was engaged","error":"leader election lost","stacktrace":"sigs.k8s.io/controller-runtime/pkg/manager.(*controllerManager).engageStopProcedure.func1\n\tsigs.k8s.io/controller-runtime@v0.13.1/pkg/manager/internal.go:545"}

We need to fix this and make sure the Lease is released when the manager stops.
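A minimal sketch of the kind of fix this implies (an assumption on my side, not necessarily what the eventual PR does): enable controller-runtime's `LeaderElectionReleaseOnCancel` so the manager voluntarily steps down on shutdown. Note this only helps if the operator's service account also has `update` on `leases` in `coordination.k8s.io`, which is exactly the permission the error above reports as missing:

```go
package main

import (
	ctrl "sigs.k8s.io/controller-runtime"
	"sigs.k8s.io/controller-runtime/pkg/manager"
)

// newManager sketches enabling voluntary lease release on shutdown.
func newManager() (manager.Manager, error) {
	return ctrl.NewManager(ctrl.GetConfigOrDie(), ctrl.Options{
		LeaderElection:          true,
		LeaderElectionID:        "camel-k-lock",
		LeaderElectionNamespace: "default",
		// On graceful shutdown, step down by updating the Lease so the
		// next instance can acquire it immediately instead of waiting
		// out the full LeaseDuration. Requires RBAC "update" on
		// leases.coordination.k8s.io for the operator service account.
		LeaderElectionReleaseOnCancel: true,
	})
}
```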

squakez added a commit to squakez/camel-k that referenced this issue Apr 14, 2023
squakez added a commit to squakez/camel-k that referenced this issue Apr 26, 2023
squakez added a commit to squakez/camel-k that referenced this issue Apr 26, 2023
squakez added a commit to squakez/camel-k that referenced this issue Apr 26, 2023
squakez added a commit that referenced this issue Apr 26, 2023