Charm doesn't react to pebble_ready so if the workload container restarts the charm stays in maintenance status #129
Comments
Thank you for opening this issue @mthaddon; we've added this to our next pulse.
I'm pretty sure this was fixed when we moved to using collect status. I'll promote the charm from edge to candidate. @mthaddon, can you validate whether you are still experiencing the issue with the candidate release?
I can't see a way to manually kill the workload container for this charm: there's no shell for me to connect to in the workload container, and the pebble plan seems to be empty (I've just deployed the charm and set …).
Indeed, there is no pebble plan; we only make LEGO CLI calls when needed. Here I simply deleted the k8s pod.
@gruyaume that deletes the entire pod. What I need to do here to confirm the bug is fixed is to kill the lego container (which will then be restarted) while leaving the charm container running.
Just for a bit more context, here's the output of … As you can see, the lego container has been restarted twice, and the last restart was on …
I'm quite confident this would have been fixed by switching to "collect status", since the status is now re-evaluated on every event. I'm not quite sure how to confirm this 100%, though.
Describe the bug
The charm doesn't react to pebble_ready so if the workload container restarts the charm stays in maintenance status.
To Reproduce
Expected behavior
The charm should be active/idle.
Screenshots
N/A
Logs
N/A
Environment
juju --version: 3.1.8
kubectl version --short:
  Client Version: v1.29.4
  Kustomize Version: v5.0.4-0.20230601165947-6ce0bf390ce3
  Server Version: v1.26.15
Additional context
N/A