Hi, I'm facing a deadlock when streaming data using Pump. The source is the async Azure client from Microsoft (not sure that's relevant), which uses a Netty thread pool underneath. Looking at Pump's dataHandler, where the deadlock occurs, I can hardly see how it could allow two different threads (a Vert.x one and a Netty one, in my case) to execute concurrently. My code is trivial, the same as in the documentation:
ReactiveReadStream<Buffer> rrs = ReactiveReadStream.readStream();
// Subscribe the read stream to the publisher
otherPublisher.subscribe(rrs);
// Pump from the read stream to the http response
Pump pump = Pump.pump(rrs, response);
pump.start();
So I'm wondering what I'm missing; might this be a bug in Vert.x?
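For reference, the dump below boils down to a classic lock-order inversion: each thread holds one of two monitors and waits for the other. Here is a minimal stand-alone sketch of that pattern (the names are mine and the Vert.x internals are paraphrased; this is not actual Vert.x code):

public class LockOrderDemo {
    static final Object streamMonitor = new Object();     // stands in for ReactiveReadStreamImpl
    static final Object connectionMonitor = new Object(); // stands in for Http1xServerConnection

    public static void main(String[] args) {
        // Like "nioEventLoopGroup-2-13": onNext() locks the stream, then the
        // Pump dataHandler calls writeQueueFull(), which needs the connection.
        new Thread(() -> {
            synchronized (streamMonitor) {
                pause();
                synchronized (connectionMonitor) { /* never reached */ }
            }
        }, "netty-thread").start();

        // Like "vert.x-eventloop-thread-0": handleDrained() locks the connection,
        // then the Pump drainHandler calls resume(), which needs the stream.
        new Thread(() -> {
            synchronized (connectionMonitor) {
                pause();
                synchronized (streamMonitor) { /* never reached */ }
            }
        }, "vertx-thread").start();
    }

    static void pause() {
        try { Thread.sleep(100); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }
}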
Found one Java-level deadlock:
=============================
"nioEventLoopGroup-2-13":
waiting to lock monitor 0x000000002294b9b8 (object 0x000000076b92c3a0, a io.vertx.core.http.impl.Http1xServerConnection),
which is held by "vert.x-eventloop-thread-0"
"vert.x-eventloop-thread-0":
waiting to lock monitor 0x000000002261fe28 (object 0x000000076b44df28, a io.vertx.ext.reactivestreams.impl.ReactiveReadStreamImpl),
which is held by "nioEventLoopGroup-2-13"
Java stack information for the threads listed above:
===================================================
"nioEventLoopGroup-2-13":
at io.vertx.core.http.impl.HttpServerResponseImpl.writeQueueFull(HttpServerResponseImpl.java:233)
- waiting to lock <0x000000076b92c3a0> (a io.vertx.core.http.impl.Http1xServerConnection)
at io.vertx.core.streams.impl.PumpImpl.lambda$new$1(PumpImpl.java:61)
at io.vertx.core.streams.impl.PumpImpl$$Lambda$251/498218019.handle(Unknown Source)
at io.vertx.ext.reactivestreams.impl.ReactiveReadStreamImpl.handleData(ReactiveReadStreamImpl.java:131)
- eliminated <0x000000076b44df28> (a io.vertx.ext.reactivestreams.impl.ReactiveReadStreamImpl)
at io.vertx.ext.reactivestreams.impl.ReactiveReadStreamImpl.onNext(ReactiveReadStreamImpl.java:101)
- locked <0x000000076b44df28> (a io.vertx.ext.reactivestreams.impl.ReactiveReadStreamImpl)
at reactor.core.publisher.StrictSubscriber.onNext(StrictSubscriber.java:89)
at reactor.core.publisher.FluxMap$MapSubscriber.onNext(FluxMap.java:108)
at reactor.core.publisher.MonoFlatMapMany$FlatMapManyInner.onNext(MonoFlatMapMany.java:238)
at io.reactivex.internal.util.HalfSerializer.onNext(HalfSerializer.java:45)
at io.reactivex.internal.subscribers.StrictSubscriber.onNext(StrictSubscriber.java:97)
at io.reactivex.internal.operators.flowable.FlowableDoOnEach$DoOnEachSubscriber.onNext(FlowableDoOnEach.java:92)
at io.reactivex.internal.operators.flowable.FlowableTimeoutTimed$TimeoutSubscriber.onNext(FlowableTimeoutTimed.java:99)
at io.reactivex.internal.operators.flowable.FlowableOnErrorNext$OnErrorNextSubscriber.onNext(FlowableOnErrorNext.java:69)
at io.reactivex.internal.operators.flowable.FlowableDoOnEach$DoOnEachSubscriber.onNext(FlowableDoOnEach.java:92)
at io.reactivex.internal.operators.flowable.FlowableMap$MapSubscriber.onNext(FlowableMap.java:69)
at com.microsoft.rest.v2.http.NettyClient$ResponseContentFlowable.emitContent(NettyClient.java:757)
at com.microsoft.rest.v2.http.NettyClient$ResponseContentFlowable.drain(NettyClient.java:722)
at com.microsoft.rest.v2.http.NettyClient$ResponseContentFlowable.onReceivedContent(NettyClient.java:645)
at com.microsoft.rest.v2.http.NettyClient$HttpClientInboundHandler.channelRead(NettyClient.java:870)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:438)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:310)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:297)
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:413)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:265)
at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:253)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:141)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:886)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
"vert.x-eventloop-thread-0":
at io.vertx.ext.reactivestreams.impl.ReactiveReadStreamImpl.resume(ReactiveReadStreamImpl.java:62)
- waiting to lock <0x000000076b44df28> (a io.vertx.ext.reactivestreams.impl.ReactiveReadStreamImpl)
at io.vertx.ext.reactivestreams.impl.ReactiveReadStreamImpl.resume(ReactiveReadStreamImpl.java:30)
at io.vertx.core.streams.impl.PumpImpl.lambda$new$0(PumpImpl.java:57)
at io.vertx.core.streams.impl.PumpImpl$$Lambda$250/1955684275.handle(Unknown Source)
at io.vertx.core.http.impl.HttpServerResponseImpl.handleDrained(HttpServerResponseImpl.java:513)
- locked <0x000000076b92c3a0> (a io.vertx.core.http.impl.Http1xServerConnection)
at io.vertx.core.http.impl.Http1xServerConnection.handleInterestedOpsChanged(Http1xServerConnection.java:313)
- locked <0x000000076b92c3a0> (a io.vertx.core.http.impl.Http1xServerConnection)
at io.vertx.core.http.impl.HttpServerResponseImpl.lambda$drainHandler$0(HttpServerResponseImpl.java:246)
at io.vertx.core.http.impl.HttpServerResponseImpl$$Lambda$290/134406675.handle(Unknown Source)
at io.vertx.core.impl.ContextImpl.lambda$wrapTask$2(ContextImpl.java:339)
at io.vertx.core.impl.ContextImpl$$Lambda$12/1147805316.run(Unknown Source)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute$$$capture(AbstractEventExecutor.java:163)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:886)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
Found 1 deadlock.
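In case it helps: one possible workaround, just a sketch under the assumption that an extra hop per buffer is acceptable, is to re-dispatch every Reactive Streams signal onto the Vert.x context before it reaches the ReactiveReadStream, so the Netty client thread never runs Pump's handlers and both monitors are only ever taken by the event-loop thread. ContextSubscriber below is a name I made up, not a Vert.x class:

import io.vertx.core.Context;
import org.reactivestreams.Subscriber;
import org.reactivestreams.Subscription;

// Hypothetical helper: forwards every Reactive Streams signal on the Vert.x context.
class ContextSubscriber<T> implements Subscriber<T> {
    private final Subscriber<T> delegate;
    private final Context context;

    ContextSubscriber(Subscriber<T> delegate, Context context) {
        this.delegate = delegate;
        this.context = context;
    }

    @Override public void onSubscribe(Subscription s) { context.runOnContext(v -> delegate.onSubscribe(s)); }
    @Override public void onNext(T t) { context.runOnContext(v -> delegate.onNext(t)); }
    @Override public void onError(Throwable e) { context.runOnContext(v -> delegate.onError(e)); }
    @Override public void onComplete() { context.runOnContext(v -> delegate.onComplete()); }
}

Used in place of the direct subscribe from the snippet above:

Context ctx = vertx.getOrCreateContext();
ReactiveReadStream<Buffer> rrs = ReactiveReadStream.readStream();
otherPublisher.subscribe(new ContextSubscriber<>(rrs, ctx));
Pump.pump(rrs, response).start();

This keeps all calls into ReactiveReadStreamImpl and Http1xServerConnection on the same thread, which removes the opposite-order locking shown in the dump.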