LoggingHandler hangs the thread #2796

Open
sparhomenko opened this Issue Jan 20, 2018 · 19 comments

sparhomenko commented Jan 20, 2018

Our application runs on Google Kubernetes Engine using the gcr.io/google-appengine/jetty image and uses com.google.cloud.logging.LoggingHandler to publish logs to Stackdriver. We noticed some worker threads becoming unresponsive over time. When the pod is shutting down, we can see the following exception for each of them:

java.lang.RuntimeException: java.lang.InterruptedException
	at com.google.cloud.logging.LoggingImpl.flush(LoggingImpl.java:545)
	at com.google.cloud.logging.LoggingImpl.write(LoggingImpl.java:525)
	at com.google.cloud.logging.LoggingHandler.publish(LoggingHandler.java:273)
	at java.util.logging.Logger.log(Logger.java:738)
	at org.slf4j.impl.JDK14LoggerAdapter.log(JDK14LoggerAdapter.java:582)
	at org.slf4j.impl.JDK14LoggerAdapter.error(JDK14LoggerAdapter.java:500)
        ...
Caused by: java.lang.InterruptedException
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:449)
	at com.google.common.util.concurrent.AbstractFuture$TrustedFuture.get(AbstractFuture.java:79)
	at com.google.common.util.concurrent.ForwardingFuture.get(ForwardingFuture.java:63)
	at com.google.cloud.logging.LoggingImpl.flush(LoggingImpl.java:543)
	... 30 more

We'll try to extract a thread dump to see why the future never completes, but the issue seems dangerous by itself: LoggingImpl.java:543 uses the non-timeout version of Future.get(), which can cause any logger call to block the current thread forever unless interrupted. Would it be possible to use the timeout version with a reasonably large timeout, e.g. 60 seconds?
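For illustration, a minimal sketch of the kind of bounded wait being suggested; the helper name, the surrounding loop, and the 60-second value are assumptions, not the library's actual code:

import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

// Hypothetical flush helper: wait for pending write futures with a bounded timeout
// so that a future which never completes cannot hang the logging caller forever.
class BoundedFlush {
  static void flushWithTimeout(List<Future<?>> pendingWrites) throws InterruptedException {
    for (Future<?> future : pendingWrites) {
      try {
        future.get(60, TimeUnit.SECONDS); // bounded wait instead of future.get()
      } catch (TimeoutException e) {
        // Give up on this batch rather than blocking the caller indefinitely.
        throw new RuntimeException("Timed out waiting for log write to complete", e);
      } catch (ExecutionException e) {
        throw new RuntimeException(e.getCause());
      }
    }
  }
}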

@pongad pongad self-assigned this Jan 22, 2018

Collaborator

pongad commented Jan 22, 2018

This seems related to #1795. The underlying client provides no way to initiate a flush.

@sparhomenko Do you have a sense of how long we're waiting? The above bug might cause us to wait a few seconds, but it shouldn't cause a prolonged hang. I agree that the wait with no timeout is a problem, but I want to understand the problem better before sending fixes.

sparhomenko commented Jan 22, 2018

@pongad The longest one we observed so far was 2.5 hours, until it was manually interrupted. We're not sure what caused the future to hang for so long. The issue seems to be intermittently reproducible; I'll try to get more diagnostic information the next time we hit it.

Collaborator

pongad commented Jan 25, 2018

@sparhomenko Yikes! 2.5 hours is WAY beyond reasonable. I now suspect some kind of propagation problem somewhere. Maybe we don't properly complete the future for something. I'll try to dig into this as well, but a repro would be greatly appreciated.

tamersaleh commented Feb 11, 2018

Hi @pongad, I'm about to use the Google Cloud Logging API for logging in a multithreaded Java app that will have close to 5k threads running concurrently. Any updates, please?

Collaborator

pongad commented Feb 12, 2018

@tamersaleh I unfortunately don't have concrete progress to share yet. We plan to simplify the batching logic. The simplification should let us smoke out many of these bugs.

sparhomenko commented Feb 12, 2018

@pongad I'm afraid we are not able to reproduce the issue anymore. The only useful thing we found was that, on the pods where this was reproducible, LoggingHandler actually stopped producing any logs after this exception occurred:

java.lang.RuntimeException: com.google.cloud.logging.LoggingException: io.grpc.StatusRuntimeException: UNAVAILABLE: 503:Service Unavailable
	at com.google.cloud.logging.LoggingImpl$7.onFailure(LoggingImpl.java:577)
	at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:52)
	at com.google.common.util.concurrent.Futures$6.run(Futures.java:1764)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:456)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:817)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:753)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:634)
	at com.google.common.util.concurrent.Futures$AbstractChainingFuture.run(Futures.java:1405)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:456)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:817)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:753)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:634)
	at com.google.common.util.concurrent.Futures$AbstractCatchingFuture.run(Futures.java:794)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:456)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:817)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:753)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:634)
	at com.google.api.core.AbstractApiFuture$InternalSettableFuture.setException(AbstractApiFuture.java:96)
	at com.google.api.core.AbstractApiFuture.setException(AbstractApiFuture.java:78)
	at com.google.api.gax.grpc.BatchedRequestIssuer.sendResult(BatchedRequestIssuer.java:68)
	at com.google.api.gax.grpc.BatchExecutor$1.onFailure(BatchExecutor.java:96)
	at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:52)
	at com.google.common.util.concurrent.Futures$6.run(Futures.java:1764)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:456)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:817)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:753)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:634)
	at com.google.api.gax.retrying.RetryingFutureImpl.executeAttempt(RetryingFutureImpl.java:142)
	at com.google.api.gax.retrying.RetryingFutureImpl.access$500(RetryingFutureImpl.java:58)
	at com.google.api.gax.retrying.RetryingFutureImpl$AttemptFutureCallback.onFailure(RetryingFutureImpl.java:176)
	at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:52)
	at com.google.common.util.concurrent.Futures$6.run(Futures.java:1764)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:456)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:817)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:753)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:634)
	at com.google.api.core.AbstractApiFuture$InternalSettableFuture.setException(AbstractApiFuture.java:96)
	at com.google.api.core.AbstractApiFuture.setException(AbstractApiFuture.java:78)
	at com.google.api.gax.grpc.ExceptionTransformingCallable$ExceptionTransformingFuture.onFailure(ExceptionTransformingCallable.java:108)
	at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:52)
	at com.google.common.util.concurrent.Futures$6.run(Futures.java:1764)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:456)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:817)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:753)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:634)
	at io.grpc.stub.ClientCalls$GrpcFuture.setException(ClientCalls.java:463)
	at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:439)
	at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:428)
	at io.grpc.internal.ClientCallImpl.access$100(ClientCallImpl.java:76)
	at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.close(ClientCallImpl.java:514)
	at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.access$700(ClientCallImpl.java:431)
	at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:546)
	at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:52)
	at io.grpc.internal.SerializingExecutor$TaskRunner.run(SerializingExecutor.java:152)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Caused by: com.google.cloud.logging.LoggingException: io.grpc.StatusRuntimeException: UNAVAILABLE: 503:Service Unavailable
	at com.google.cloud.logging.spi.v2.GrpcLoggingRpc$1.apply(GrpcLoggingRpc.java:141)
	at com.google.cloud.logging.spi.v2.GrpcLoggingRpc$1.apply(GrpcLoggingRpc.java:135)
	at com.google.api.core.ApiFutures$GaxFunctionToGuavaFunction.apply(ApiFutures.java:124)
	at com.google.common.util.concurrent.Futures$CatchingFuture.doFallback(Futures.java:842)
	at com.google.common.util.concurrent.Futures$CatchingFuture.doFallback(Futures.java:834)
	at com.google.common.util.concurrent.Futures$AbstractCatchingFuture.run(Futures.java:789)
	... 48 more
Caused by: com.google.api.gax.grpc.ApiException: io.grpc.StatusRuntimeException: UNAVAILABLE: 503:Service Unavailable
	... 23 more
Caused by: io.grpc.StatusRuntimeException: UNAVAILABLE: 503:Service Unavailable
	at io.grpc.Status.asRuntimeException(Status.java:540)
	... 15 more

These might have been caused by us producing too many logs, which we are not doing anymore.

One other suspicious thing is that the latest version of the google-appengine Docker image we use still ships a pretty old version of google-cloud-logging and related dependencies. So the issue might have been triggered by a problem in that version, or by a classpath conflict between the provided dependencies and the newer ones we package in our WAR.

dansomething commented Feb 12, 2018

FWIW, we have experienced similar issues using both JUL and Logback as our client for google-cloud-logging. The hangs seem to correlate with gRPC connection issues. We have sadly never been able to reliably reproduce the issue. We eventually had to give up on using the "Enhanced Stackdriver Logging" feature of the jetty-runtime.

Here are some stack traces from our logs, if that helps.

java.lang.RuntimeException: com.google.cloud.logging.LoggingException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 49999922530ns
	at com.google.cloud.logging.LoggingImpl$7.onFailure(LoggingImpl.java:577)
	at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:61)
	at com.google.common.util.concurrent.Futures$4.run(Futures.java:1123)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
	at com.google.common.util.concurrent.AbstractTransformFuture.run(AbstractTransformFuture.java:112)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
	at com.google.common.util.concurrent.AbstractCatchingFuture.run(AbstractCatchingFuture.java:134)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
	at com.google.api.core.AbstractApiFuture$InternalSettableFuture.setException(AbstractApiFuture.java:95)
	at com.google.api.core.AbstractApiFuture.setException(AbstractApiFuture.java:77)
	at com.google.api.gax.rpc.BatchedFuture.setException(BatchedFuture.java:55)
	at com.google.api.gax.rpc.BatchedRequestIssuer.sendResult(BatchedRequestIssuer.java:82)
	at com.google.api.gax.rpc.BatchExecutor$1.onFailure(BatchExecutor.java:96)
	at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:61)
	at com.google.common.util.concurrent.Futures$4.run(Futures.java:1123)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
	at com.google.api.gax.retrying.BasicRetryingFuture.handleAttempt(BasicRetryingFuture.java:157)
	at com.google.api.gax.retrying.CallbackChainRetryingFuture$AttemptCompletionListener.handle(CallbackChainRetryingFuture.java:134)
	at com.google.api.gax.retrying.CallbackChainRetryingFuture$AttemptCompletionListener.run(CallbackChainRetryingFuture.java:116)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
	at com.google.api.core.AbstractApiFuture$InternalSettableFuture.setException(AbstractApiFuture.java:95)
	at com.google.api.core.AbstractApiFuture.setException(AbstractApiFuture.java:77)
	at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.setException(GrpcExceptionCallable.java:117)
	at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.onFailure(GrpcExceptionCallable.java:101)
	at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:61)
	at com.google.common.util.concurrent.Futures$4.run(Futures.java:1123)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
	at io.grpc.stub.ClientCalls$GrpcFuture.setException(ClientCalls.java:458)
	at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:433)
	at io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:41)
	at io.grpc.internal.CensusTracingModule$TracingClientInterceptor$1$1.onClose(CensusTracingModule.java:339)
	at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:443)
	at io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:63)
	at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.close(ClientCallImpl.java:525)
	at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.access$600(ClientCallImpl.java:446)
	at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:557)
	at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:107)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Caused by: com.google.cloud.logging.LoggingException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 49999922530ns
	at com.google.cloud.logging.spi.v2.GrpcLoggingRpc$2.apply(GrpcLoggingRpc.java:155)
	at com.google.cloud.logging.spi.v2.GrpcLoggingRpc$2.apply(GrpcLoggingRpc.java:149)
	at com.google.api.core.ApiFutures$GaxFunctionToGuavaFunction.apply(ApiFutures.java:153)
	at com.google.common.util.concurrent.AbstractCatchingFuture$CatchingFuture.doFallback(AbstractCatchingFuture.java:205)
	at com.google.common.util.concurrent.AbstractCatchingFuture$CatchingFuture.doFallback(AbstractCatchingFuture.java:193)
	at com.google.common.util.concurrent.AbstractCatchingFuture.run(AbstractCatchingFuture.java:132)

java.lang.RuntimeException: com.google.cloud.logging.LoggingException: io.grpc.StatusRuntimeException: UNAVAILABLE: Authentication backend unavailable.
	at com.google.cloud.logging.LoggingImpl$7.onFailure(LoggingImpl.java:577)
	at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:61)
	at com.google.common.util.concurrent.Futures$4.run(Futures.java:1123)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
	at com.google.common.util.concurrent.AbstractTransformFuture.run(AbstractTransformFuture.java:112)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
	at com.google.common.util.concurrent.AbstractCatchingFuture.run(AbstractCatchingFuture.java:134)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
	at com.google.api.core.AbstractApiFuture$InternalSettableFuture.setException(AbstractApiFuture.java:95)
	at com.google.api.core.AbstractApiFuture.setException(AbstractApiFuture.java:77)
	at com.google.api.gax.rpc.BatchedFuture.setException(BatchedFuture.java:55)
	at com.google.api.gax.rpc.BatchedRequestIssuer.sendResult(BatchedRequestIssuer.java:82)
	at com.google.api.gax.rpc.BatchExecutor$1.onFailure(BatchExecutor.java:96)
	at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:61)
	at com.google.common.util.concurrent.Futures$4.run(Futures.java:1123)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
	at com.google.api.gax.retrying.BasicRetryingFuture.handleAttempt(BasicRetryingFuture.java:157)
	at com.google.api.gax.retrying.CallbackChainRetryingFuture$AttemptCompletionListener.handle(CallbackChainRetryingFuture.java:134)
	at com.google.api.gax.retrying.CallbackChainRetryingFuture$AttemptCompletionListener.run(CallbackChainRetryingFuture.java:116)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
	at com.google.api.core.AbstractApiFuture$InternalSettableFuture.setException(AbstractApiFuture.java:95)
	at com.google.api.core.AbstractApiFuture.setException(AbstractApiFuture.java:77)
	at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.setException(GrpcExceptionCallable.java:117)
	at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.onFailure(GrpcExceptionCallable.java:101)
	at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:61)
	at com.google.common.util.concurrent.Futures$4.run(Futures.java:1123)
	at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
	at io.grpc.stub.ClientCalls$GrpcFuture.setException(ClientCalls.java:458)
	at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:433)
	at io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:41)
	at io.grpc.internal.CensusTracingModule$TracingClientInterceptor$1$1.onClose(CensusTracingModule.java:339)
	at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:443)
	at io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:63)
	at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.close(ClientCallImpl.java:525)
	at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.access$600(ClientCallImpl.java:446)
	at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:557)
	at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:107)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Caused by: com.google.cloud.logging.LoggingException: io.grpc.StatusRuntimeException: UNAVAILABLE: Authentication backend unavailable.
	at com.google.cloud.logging.spi.v2.GrpcLoggingRpc$2.apply(GrpcLoggingRpc.java:155)
	at com.google.cloud.logging.spi.v2.GrpcLoggingRpc$2.apply(GrpcLoggingRpc.java:149)
	at com.google.api.core.ApiFutures$GaxFunctionToGuavaFunction.apply(ApiFutures.java:153)
	at com.google.common.util.concurrent.AbstractCatchingFuture$CatchingFuture.doFallback(AbstractCatchingFuture.java:205)
	at com.google.common.util.concurrent.AbstractCatchingFuture$CatchingFuture.doFallback(AbstractCatchingFuture.java:193)
	at com.google.common.util.concurrent.AbstractCatchingFuture.run(AbstractCatchingFuture.java:132)

tamersaleh commented Feb 13, 2018

@pongad My experience with Cloud Logging is just that: when I call logging.write(Collections.singleton(logEntry)), it doesn't seem to be invoked asynchronously, and it puts 90% of my threads in a BLOCKED state.

Collaborator

pongad commented Feb 13, 2018

The method taking 1 min to time out is curious. That's the deadline we set on our RPCs, but Logging shouldn't take that long. This might be related to threads being in the BLOCKED state; if threads are blocked, they can't service RPCs.

Could you share with us the stack trace of the BLOCKED threads?

@garrettjonesgoogle FYI

tamersaleh commented Feb 14, 2018

Some of them are BLOCKED, and others are WAITING (parking).

A sample "WAITING (parking)" thread:

"user_mini_table_76c0c70d45e05cc_616" #838 prio=10 os_prio=0 tid=0x00007f511c494000 nid=0x541b waiting on condition [0x00007eae3bddf000]
   java.lang.Thread.State: WAITING (parking)
        at sun.misc.Unsafe.park(Native Method)
        - parking to wait for  <0x00000003c0478268> (a java.util.concurrent.locks.ReentrantLock$NonfairSync)
        at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireQueued(AbstractQueuedSynchronizer.java:870)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquire(AbstractQueuedSynchronizer.java:1199)
        at java.util.concurrent.locks.ReentrantLock$NonfairSync.lock(ReentrantLock.java:209)
        at java.util.concurrent.locks.ReentrantLock.lock(ReentrantLock.java:285)
        at com.google.api.gax.batching.ThresholdBatcher.add(ThresholdBatcher.java:167)
        at com.google.api.gax.rpc.BatchingCallable.futureCall(BatchingCallable.java:73)
        at com.google.api.gax.rpc.UnaryCallable$1.futureCall(UnaryCallable.java:126)
        at com.google.api.gax.rpc.UnaryCallable.futureCall(UnaryCallable.java:87)
        at com.google.cloud.logging.spi.v2.GrpcLoggingRpc.write(GrpcLoggingRpc.java:223)
        at com.google.cloud.logging.LoggingImpl.writeAsync(LoggingImpl.java:593)
        at com.google.cloud.logging.LoggingImpl.writeLogEntries(LoggingImpl.java:559)
        at com.google.cloud.logging.LoggingImpl.write(LoggingImpl.java:522)
        at com.dating.util.CloudLogger.log(CloudLogger.java:106)
        at com.dating.util.CloudLogger.logInfo(CloudLogger.java:83)
        at com.dating.util.Matching.getMyBestMatch(Matching.java:120)
        at com.harmonica.match.FindMyMaleMatchService.getMatches(FindMyMaleMatchService.java:61)
        at com.harmonica.match.FindMyMaleMatchService.findMalesForGirls(FindMyMaleMatchService.java:40)
        at com.harmonica.match.MatchGroupService.lambda$getMatchBatch$2(MatchGroupService.java:236)
        at com.harmonica.match.MatchGroupService$$Lambda$24/1081662793.call(Unknown Source)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

sample of "BLOCKED":

"user_mini_table_76c0c70d45e05cc_615" #837 prio=10 os_prio=0 tid=0x00007f511c492000 nid=0x541a in Object.wait() [0x00007eae7bde0000]
   java.lang.Thread.State: BLOCKED (on object monitor)
        at java.lang.Object.wait(Native Method)
        at java.lang.Object.wait(Object.java:502)
        at com.google.api.gax.batching.BlockingSemaphore.acquire(BlockingSemaphore.java:61)
        - locked <0x00000003c04781e0> (a com.google.api.gax.batching.BlockingSemaphore)
        at com.google.api.gax.batching.FlowController.reserve(FlowController.java:197)
        at com.google.api.gax.batching.BatchingFlowController.reserve(BatchingFlowController.java:58)
        at com.google.api.gax.batching.ThresholdBatcher.add(ThresholdBatcher.java:166)
        at com.google.api.gax.rpc.BatchingCallable.futureCall(BatchingCallable.java:73)
        at com.google.api.gax.rpc.UnaryCallable$1.futureCall(UnaryCallable.java:126)
        at com.google.api.gax.rpc.UnaryCallable.futureCall(UnaryCallable.java:87)
        at com.google.cloud.logging.spi.v2.GrpcLoggingRpc.write(GrpcLoggingRpc.java:223)
        at com.google.cloud.logging.LoggingImpl.writeAsync(LoggingImpl.java:593)
        at com.google.cloud.logging.LoggingImpl.writeLogEntries(LoggingImpl.java:559)
        at com.google.cloud.logging.LoggingImpl.write(LoggingImpl.java:522)
        at com.dating.util.CloudLogger.log(CloudLogger.java:106)
        at com.dating.util.CloudLogger.logInfo(CloudLogger.java:83)
        at com.dating.util.Matching.getMyBestMatch(Matching.java:120)
        at com.harmonica.match.FindMyMaleMatchService.getMatches(FindMyMaleMatchService.java:61)
        at com.harmonica.match.FindMyMaleMatchService.findMalesForGirls(FindMyMaleMatchService.java:40)
        at com.harmonica.match.MatchGroupService.lambda$getMatchBatch$2(MatchGroupService.java:236)
        at com.harmonica.match.MatchGroupService$$Lambda$24/1081662793.call(Unknown Source)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

The line at CloudLogger.java:106 is actually: logging.write(Collections.singleton(logEntry));

Collaborator

pongad commented Feb 14, 2018

@tamersaleh Both of these are somewhat expected.

For WAITING, the thread seems to be waiting to lock a shared structure. I'd expect the number to be relatively small though, since the threads should update the structure then unlock quickly.

For BLOCKED, that's caused by Logging throttling the application because there are too many logs waiting to be sent. This usually happens when logs are generated faster than they can be sent to the server.

I assume you're not deadlocked, but progressing slowly? Do you have a sense of how many logs you're sending and how large they are? If the RPCs are too slow, there are some enhancements we can make, like opening multiple connections, etc.
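
For context, a hedged sketch of how the gax flow-control limits behind that throttling are expressed. Whether and how these settings can be wired into the Logging client's write batching depends on the client version, so the values and the wiring are assumptions, not a recommendation:

import com.google.api.gax.batching.BatchingSettings;
import com.google.api.gax.batching.FlowControlSettings;
import com.google.api.gax.batching.FlowController.LimitExceededBehavior;
import org.threeten.bp.Duration;

// Illustrative only: bound how many log entries may be outstanding at once.
// LimitExceededBehavior.Block is what produces the BLOCKED producer threads above;
// ThrowException (or Ignore) would fail fast instead of throttling callers.
class LoggingBatchingSketch {
  static BatchingSettings batchingSettings() {
    FlowControlSettings flowControl =
        FlowControlSettings.newBuilder()
            .setMaxOutstandingElementCount(10_000L)          // assumed limit
            .setMaxOutstandingRequestBytes(10L * 1024 * 1024) // assumed limit
            .setLimitExceededBehavior(LimitExceededBehavior.Block)
            .build();
    return BatchingSettings.newBuilder()
        .setElementCountThreshold(1_000L)
        .setRequestByteThreshold(1L * 1024 * 1024)
        .setDelayThreshold(Duration.ofMillis(50))
        .setFlowControlSettings(flowControl)
        .build();
  }
}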

tamersaleh commented Feb 14, 2018

@pongad
Yes, that's exactly what I'm facing now; logs are generated at a rate of > 100,000 entries/sec.

tamersaleh commented Feb 14, 2018

@pongad
Right now I've made some changes: threads send all logs to a ConcurrentLinkedQueue, and a single thread polls logs from there and posts them in chunks of 4k entries by calling logging.write(entries). This single thread is scheduled to run every 2 seconds, but even that is not fast enough. I'm thinking that if the RPC is fast enough, I could enlarge the chunk, maybe to 50k or more.
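
For reference, a minimal sketch of the queue-and-drain approach described above, assuming the google-cloud-logging Logging client; the class name, the 4,000-entry chunk size, and the 2-second schedule simply mirror the description:

import com.google.cloud.logging.LogEntry;
import com.google.cloud.logging.Logging;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Hypothetical drainer: producer threads only enqueue; a single scheduled thread
// drains the queue and ships entries to Stackdriver in bounded chunks.
public class CloudLogDrainer {
  private static final int CHUNK_SIZE = 4_000;

  private final ConcurrentLinkedQueue<LogEntry> queue = new ConcurrentLinkedQueue<>();
  private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
  private final Logging logging;

  public CloudLogDrainer(Logging logging) {
    this.logging = logging;
    scheduler.scheduleWithFixedDelay(this::drain, 2, 2, TimeUnit.SECONDS);
  }

  public void enqueue(LogEntry entry) {
    queue.offer(entry); // never blocks the application thread
  }

  private void drain() {
    List<LogEntry> chunk = new ArrayList<>(CHUNK_SIZE);
    LogEntry entry;
    while ((entry = queue.poll()) != null) {
      chunk.add(entry);
      if (chunk.size() == CHUNK_SIZE) {
        logging.write(chunk);
        chunk = new ArrayList<>(CHUNK_SIZE);
      }
    }
    if (!chunk.isEmpty()) {
      logging.write(chunk);
    }
  }
}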

tamersaleh commented Feb 14, 2018

@pongad I've got an idea:
what if I call POST https://logging.googleapis.com/v2/entries:write directly?
Can I post concurrent and batched calls faster than the client library?
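
For what it's worth, a rough sketch of what such a direct call might look like. The JSON follows the v2 entries.write request shape; the project and log names and the access-token handling are placeholders, and whether this beats the client library's batching is not established here:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Hypothetical direct REST call to the Cloud Logging v2 write endpoint.
public class RawEntriesWrite {
  public static void main(String[] args) throws Exception {
    String accessToken = System.getenv("ACCESS_TOKEN"); // placeholder: obtain via your auth library
    String body = "{"
        + "\"logName\": \"projects/my-project/logs/my-log\","  // placeholder project/log
        + "\"resource\": {\"type\": \"global\"},"
        + "\"entries\": [{\"textPayload\": \"hello from raw REST\"}]"
        + "}";

    HttpURLConnection conn =
        (HttpURLConnection) new URL("https://logging.googleapis.com/v2/entries:write").openConnection();
    conn.setRequestMethod("POST");
    conn.setRequestProperty("Authorization", "Bearer " + accessToken);
    conn.setRequestProperty("Content-Type", "application/json");
    conn.setDoOutput(true);
    try (OutputStream os = conn.getOutputStream()) {
      os.write(body.getBytes(StandardCharsets.UTF_8));
    }
    System.out.println("HTTP " + conn.getResponseCode());
  }
}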

tamersaleh commented Feb 15, 2018

@pongad Thanks a lot. When will this new change be in the Maven repository?

@pongad pongad removed their assignment Jul 16, 2018

@JustinBeckwith JustinBeckwith added the 🚨 label Jul 19, 2018

Contributor

csainty commented Aug 23, 2018

What is the status here?

Our application is taken down almost every night by the RuntimeException being thrown deep inside the async logging handler - https://github.com/GoogleCloudPlatform/google-cloud-java/blob/master/google-cloud-clients/google-cloud-logging/src/main/java/com/google/cloud/logging/LoggingImpl.java#L578

How are we supposed to catch and handle this exception, since it isn't in our own call stack?

Collaborator

pongad commented Sep 20, 2018

The immediate issue should be resolved now (by bumping the gax dependency). There should probably be a way to configure logging clients to drop logs instead of blocking for some applications. Turning this into a feature request.

michael-barker commented Oct 5, 2018

I have a sample project that hangs every time log.error(...) is called. This happens both when running locally and when deployed to GKE. I also tried bumping gax and gax-rpc to 1.32.0, but that didn't seem to help. This is a WebFlux app, but the hang appears to happen regardless of whether I use WebFlux or WebMVC.

edit: Looks like I don't need spring-cloud-gcp-starter-logging in order for logs to be sent to Stackdriver. Removing it fixed the issue for me. Maybe the sample project could still be useful, though, for reproducing the original issue.

@sduskis sduskis removed the 🚨 label Oct 9, 2018

@sduskis sduskis removed the priority: p2 label Oct 9, 2018

sergei-ivanov commented Oct 19, 2018

Hi,

We enabled Stackdriver logging (via spring-cloud-gcp-starter-logging and the gcloud Logback appender) about a week ago, and we have been facing recurring issues with it ever since.

First of all, it appears that whenever LoggingImpl has to flush, it is a blocking, synchronous operation from the caller's perspective. I initially assumed that writeSynchronicity (which is set to ASYNC by default for Logback) would ensure that all underlying API calls are offloaded to a background processing thread. That does not seem to be the case.

It would be great if all asynchronous and blocking operations (and configuration options, when available/applicable) were properly documented. Otherwise it involves a lot of guesswork on the part of library users.

So in order to mitigate sporadic delays caused by gcloud logging appender, we decided to wrap it inside an AsyncAppender. Unfortunately, it made matters worse. At some point the cloud backend failed with UNAVAILABLE: Authentication backend unavailable, which completely locked up the application's processing thread. I've done a thread dump (at the bottom of this message), and it looks like we effectively have a deadlock there.

  • The application thread is waiting on async appender, which has its buffer full and is blocked
  • The working thread of async appender is waiting for the gcloud logger to flush
  • The gRPC thread failed with an exception and tried to log that exception, which also resulted in it being blocked on the async appender call!

So the application is in a state where a recovery is impossible.

My next step in trying to mitigate this will be increasing the queue size for AsyncAppender and enabling the neverBlock option on it, which means some messages will be lost when the gcloud backend chokes, but at least the application won't be deadlocked by that:

<configuration>

    <springProperty name="APP_NAME" source="spring.application.name" default="spring"/>

    <appender name="CLOUD" class="com.google.cloud.logging.logback.LoggingAppender">
        <flushLevel>WARN</flushLevel>
        <log>${APP_NAME}</log>
    </appender>
    <appender name="ASYNC_CLOUD" class="ch.qos.logback.classic.AsyncAppender">
        <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
            <level>INFO</level>
        </filter>
        <!-- May result in message loss when cloud backend is unavailable, but won't block the application -->
        <neverBlock>true</neverBlock>
        <queueSize>2048</queueSize>
        <appender-ref ref="CLOUD"/>
    </appender>

    <root level="DEBUG">
        <appender-ref ref="ASYNC_CLOUD"/>
        <!-- Other appenders here... -->
    </root>

</configuration>

Again, maybe it's worth updating the documentation (both the project's README and the Stackdriver docs) to include the recommended AsyncAppender settings.

And here is the thread dump:

"http-nio-8080-exec-1" #58 daemon prio=5 os_prio=0 tid=0x00007f388ff7e000 nid=0x754d waiting on condition [0x00007f3834
   java.lang.Thread.State: WAITING (parking)
        at sun.misc.Unsafe.park(Native Method)
        - parking to wait for  <0x0000000733d7e4b0> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionO
        at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:
        at java.util.concurrent.ArrayBlockingQueue.put(ArrayBlockingQueue.java:353)
        at ch.qos.logback.core.AsyncAppenderBase.putUninterruptibly(AsyncAppenderBase.java:181)
        at ch.qos.logback.core.AsyncAppenderBase.put(AsyncAppenderBase.java:172)
        at ch.qos.logback.core.AsyncAppenderBase.append(AsyncAppenderBase.java:161)
        at ch.qos.logback.core.UnsynchronizedAppenderBase.doAppend(UnsynchronizedAppenderBase.java:84)
        at ch.qos.logback.core.spi.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:51)
        at ch.qos.logback.classic.Logger.appendLoopOnAppenders(Logger.java:270)
        at ch.qos.logback.classic.Logger.callAppenders(Logger.java:257)
        at ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:421)
        at ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:383)
        at ch.qos.logback.classic.Logger.warn(Logger.java:688)
        at <application code>

"AsyncAppender-Worker-ASYNC_CLOUD" #15 daemon prio=5 os_prio=0 tid=0x00007f388cd28000 nid=0x7512 waiting on condition [
   java.lang.Thread.State: WAITING (parking)
        at sun.misc.Unsafe.park(Native Method)
        - parking to wait for  <0x000000079bba6b18> (a com.google.common.util.concurrent.CollectionFuture$ListFuture)
        at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
        at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:471)
        at com.google.common.util.concurrent.AbstractFuture$TrustedFuture.get(AbstractFuture.java:78)
        at com.google.common.util.concurrent.ForwardingFuture.get(ForwardingFuture.java:62)
        at com.google.cloud.logging.LoggingImpl.flush(LoggingImpl.java:544)
        at com.google.cloud.logging.LoggingImpl.write(LoggingImpl.java:526)
        at com.google.cloud.logging.logback.LoggingAppender.append(LoggingAppender.java:201)
        at com.google.cloud.logging.logback.LoggingAppender.append(LoggingAppender.java:63)
        at ch.qos.logback.core.UnsynchronizedAppenderBase.doAppend(UnsynchronizedAppenderBase.java:84)
        at ch.qos.logback.core.spi.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:51)
        at ch.qos.logback.core.AsyncAppenderBase$Worker.run(AsyncAppenderBase.java:290)
   Locked ownable synchronizers:
        - None

"Gax-2" #16 daemon prio=5 os_prio=0 tid=0x00007f3830410000 nid=0x7522 waiting on condition [0x00007f3870f53000]
   java.lang.Thread.State: WAITING (parking)
        at sun.misc.Unsafe.park(Native Method)
        - parking to wait for  <0x0000000733d7e4b0> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionO
        at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:
        at java.util.concurrent.ArrayBlockingQueue.put(ArrayBlockingQueue.java:353)
        at ch.qos.logback.core.AsyncAppenderBase.putUninterruptibly(AsyncAppenderBase.java:181)
        at ch.qos.logback.core.AsyncAppenderBase.put(AsyncAppenderBase.java:172)
        at ch.qos.logback.core.AsyncAppenderBase.append(AsyncAppenderBase.java:161)
        at ch.qos.logback.core.UnsynchronizedAppenderBase.doAppend(UnsynchronizedAppenderBase.java:84)
        at ch.qos.logback.core.spi.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:51)
        at ch.qos.logback.classic.Logger.appendLoopOnAppenders(Logger.java:270)
        at ch.qos.logback.classic.Logger.callAppenders(Logger.java:257)
        at ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:421)
        at ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:383)
        at ch.qos.logback.classic.Logger.log(Logger.java:765)
        at org.slf4j.bridge.SLF4JBridgeHandler.callLocationAwareLogger(SLF4JBridgeHandler.java:221)
        at org.slf4j.bridge.SLF4JBridgeHandler.publish(SLF4JBridgeHandler.java:303)
        at java.util.logging.Logger.log(Logger.java:738)
        at java.util.logging.Logger.doLog(Logger.java:765)
        at java.util.logging.Logger.log(Logger.java:876)
        at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:907)
        at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:813)
        at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:677)
        at com.google.common.util.concurrent.AbstractTransformFuture.run(AbstractTransformFuture.java:112)
        at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:399)
        at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:902)
        at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:813)
        at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:677)
        at com.google.common.util.concurrent.AbstractCatchingFuture.run(AbstractCatchingFuture.java:134)
        at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:399)
        at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:902)
        at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:813)
        at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:677)
        at com.google.api.core.AbstractApiFuture$InternalSettableFuture.setException(AbstractApiFuture.java:95)
        at com.google.api.core.AbstractApiFuture.setException(AbstractApiFuture.java:77)
        at com.google.api.gax.rpc.BatchedFuture.setException(BatchedFuture.java:55)
        at com.google.api.gax.rpc.BatchedRequestIssuer.sendResult(BatchedRequestIssuer.java:84)
        at com.google.api.gax.rpc.BatchExecutor$1.onFailure(BatchExecutor.java:96)
        at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:61)
        at com.google.common.util.concurrent.Futures$4.run(Futures.java:1126)
        at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:399)
        at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:902)
        at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:813)
        at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:677)
        at com.google.api.core.AbstractApiFuture$InternalSettableFuture.setException(AbstractApiFuture.java:95)
        at com.google.api.core.AbstractApiFuture.setException(AbstractApiFuture.java:77)
        at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.onFailure(GrpcExceptionCallable.ja
        at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:61)
        at com.google.common.util.concurrent.Futures$4.run(Futures.java:1126)
        at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:399)
        at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:902)
        at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:813)
        at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:677)
        at io.grpc.stub.ClientCalls$GrpcFuture.setException(ClientCalls.java:493)
        at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:468)
        at io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
        at io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
        at io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener
        at io.grpc.internal.CensusStatsModule$StatsClientInterceptor$1$1.onClose(CensusStatsModule.java:684)
        at io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
        at io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
        at io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener
        at io.grpc.internal.CensusTracingModule$TracingClientInterceptor$1$1.onClose(CensusTracingModule.java:403)
        at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:459)
        at io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:63)
        at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.close(ClientCallImpl.java:546)
        at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.access$600(ClientCallImpl.java:467)
        at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:584)
        at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:29
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
   Locked ownable synchronizers:
        - <0x00000007343591f8> (a java.util.concurrent.ThreadPoolExecutor$Worker)