
fix: Thread/connection leak. #177

Merged
merged 1 commit into from Jul 29, 2017

@iocanel

iocanel commented Jul 27, 2017

No description provided.

@carlossg carlossg merged commit 4cebb12 into jenkinsci:master Jul 29, 2017


iocanel commented Jul 29, 2017

@carlossg: FWIW, this leak was introduced after upgrading to kubernetes-client 2.3.1. That version contains iocanel/kubernetes-client@bbe2cee, which can potentially address some of the pipe-related issues we see reported on JIRA.

I think we should release soon.

EDIT: Just realized that you just did :-)

@marvinthepa

marvinthepa commented Jul 31, 2017

This (among other calls of close()) triggers this warning all the time, since close() is also called from e.g. closeQuietly().

Just start a single job and watch the global Jenkins logs for that message...

Edit:
also called from https://github.com/fabric8io/kubernetes-client/blob/v2.3.1/kubernetes-client/src/main/java/io/fabric8/kubernetes/client/dsl/internal/ExecWebSocketListener.java#L114


@iocanel

iocanel replied Jul 31, 2017

You are right! But I think this is something we need to address in kubernetes-client itself.

The only thing we can do here is hide the warning until we get a fixed version of the client.
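A minimal sketch of that stopgap (hypothetical names, not the plugin's actual code): make the close handling idempotent so duplicate onClose() deliveries, e.g. the extra one arriving through closeQuietly, are swallowed instead of being logged as SEVERE:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical sketch: an idempotent close handler. Only the first
// onClose() call counts down the latch; later duplicates return silently
// instead of tripping the "latch already finished" SEVERE warning.
class CloseOnceListener {
    final AtomicBoolean closed = new AtomicBoolean(false);
    final CountDownLatch finished = new CountDownLatch(1);
    int suppressed = 0; // duplicate onClose() calls we swallowed

    void onClose(int code, String reason) {
        // compareAndSet ensures only the first caller proceeds.
        if (!closed.compareAndSet(false, true)) {
            suppressed++;
            return;
        }
        finished.countDown();
    }
}
```

This only hides the symptom on the plugin side; the double delivery itself originates in the client.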


@marvinthepa

marvinthepa replied Jul 31, 2017

What will you fix in the client? Not closing the listener?


@iocanel

iocanel replied Jul 31, 2017

There are two options:

i) only call listener.onClose() when we actually clean up the websocket and stream pumpers.
ii) split onClose() into onClosing() and onClosed().

Option (i) could be as simple as removing https://github.com/fabric8io/kubernetes-client/blob/v2.3.1/kubernetes-client/src/main/java/io/fabric8/kubernetes/client/dsl/internal/ExecWebSocketListener.java#L114, but I am not sure if there are any caveats to it.

Option (ii) is more like aligning the ExecListener with okhttp3.WebSocketListener.

It needs some thought and experimenting.
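Option (ii) could look roughly like this (a hypothetical sketch, not the actual fabric8 API): split the single close callback into two phases, mirroring okhttp3.WebSocketListener's onClosing()/onClosed() pair, so a listener can distinguish "peer signalled close" from "resources actually released":

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a split close callback: onClosing() fires when the
// remote peer signals close, onClosed() only after the websocket and stream
// pumpers are fully cleaned up.
interface SplitExecListener {
    void onClosing(int code, String reason); // peer-initiated close
    void onClosed(int code, String reason);  // resources actually released
}

// Minimal recorder demonstrating the intended call order.
class RecordingListener implements SplitExecListener {
    final List<String> events = new ArrayList<>();
    public void onClosing(int code, String reason) { events.add("closing"); }
    public void onClosed(int code, String reason)  { events.add("closed"); }
}
```

With this shape, the cleanup code at ExecWebSocketListener.java#L114 would invoke onClosing(), and onClosed() would fire only once, after the pumpers are shut down.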


@chenfli

chenfli commented Jul 31, 2017

I am seeing similar behavior. I am using helm with the kubernetes-plugin 0.12. I'll post a bug on the Jenkins bug tracker and update here accordingly:

https://issues.jenkins-ci.org/browse/JENKINS-45885

```
/home/jenkins # printf "EXITCODE %3d" $?; exit
Jul 31, 2017 1:45:28 PM org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1$1 onClose
SEVERE: onClose called but latch already finished. This indicates a bug in the kubernetes-plugin
Jul 31, 2017 1:45:28 PM org.jenkinsci.plugins.durabletask.ProcessLiveness isAlive
WARNING: org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@63f09283; decorates hudson.Launcher$RemoteLauncher@58a8b185 on hudson.remoting.Channel@428191f0:Channel to /10.2.13.41 does not seem able to determine whether processes are alive or not
^[[45;38REXITCODE 0/home/jenkins/workspace/marketing-site_chen2-master-JZLCNMYKV67ZVLSGU6NZ6TNEGT7XMATLTBJO5DINEQDOFPPSCFCA # printf "EXITCODE %3d" $?; exit
Jul 31, 2017 1:45:29 PM org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1$1 onClose
SEVERE: onClose called but latch already finished. This indicates a bug in the kubernetes-plugin
Jul 31, 2017 1:45:29 PM org.csanchez.jenkins.plugins.kubernetes.KubernetesSlave _terminate
INFO: Terminating Kubernetes instance for slave jenkins-slave-tgrfl-9qbfr
Jul 31, 2017 1:45:29 PM org.csanchez.jenkins.plugins.kubernetes.KubernetesSlave _terminate
INFO: Terminated Kubernetes instance for slave jenkins-slave-tgrfl-9qbfr
EXITCODE 0Terminated Kubernetes instance for slave jenkins-slave-tgrfl-9qbfr
Jul 31, 2017 1:45:29 PM org.csanchez.jenkins.plugins.kubernetes.KubernetesSlave _terminate
INFO: Disconnected computer jenkins-slave-tgrfl-9qbfr
Jul 31, 2017 1:45:29 PM org.jenkinsci.plugins.workflow.job.WorkflowRun finish
INFO: marketing-site/chen2-master #6 completed: FAILURE
```

