Implement a Writable that outputs a lot of data and flushes periodically, such as:
@Get(value = "/test")
public Writable getData() {
    return writer -> {
        for (int i = 0; i < 500_000_000; i++) {
            writer.write("Hello");
            if (i % 5_000_000 == 0) {
                log.info("requesting flush");
                writer.flush();
            }
        }
    };
}
Run curl against that API: curl localhost:8080/test >/dev/null
Observe the statistics reported by curl and compare them to the flush requests reported in the logs.
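For scale: if nothing reaches the client until the response is complete, the server has to buffer the whole body, and 500 million writes of the 5-byte string "Hello" come to roughly 2.5 GB, far beyond any default direct-buffer limit. A quick back-of-the-envelope check:

```java
public class ResponseSize {
    public static void main(String[] args) {
        long writes = 500_000_000L;            // loop iterations in the repro
        long bytesPerWrite = "Hello".length(); // 5 ASCII bytes per write
        long totalBytes = writes * bytesPerWrite;
        System.out.println(totalBytes);                    // 2500000000
        System.out.printf("~%.1f GB%n", totalBytes / 1e9); // ~2.5 GB
    }
}
```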
Expected Behaviour
Curl gets data as it is being flushed.
Actual Behaviour
No data is sent, and eventually an OutOfMemoryError is thrown:
Exception in thread "io-executor-thread-1" java.lang.OutOfMemoryError: Direct buffer memory
at java.base/java.nio.Bits.reserveMemory(Bits.java:175)
at java.base/java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:118)
at java.base/java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:317)
at io.netty.buffer.PoolArena$DirectArena.allocateDirect(PoolArena.java:645)
at io.netty.buffer.PoolArena$DirectArena.newUnpooledChunk(PoolArena.java:635)
at io.netty.buffer.PoolArena.allocateHuge(PoolArena.java:215)
at io.netty.buffer.PoolArena.allocate(PoolArena.java:143)
at io.netty.buffer.PoolArena.reallocate(PoolArena.java:288)
at io.netty.buffer.PooledByteBuf.capacity(PooledByteBuf.java:118)
at io.netty.buffer.AbstractByteBuf.ensureWritable0(AbstractByteBuf.java:307)
at io.netty.buffer.AbstractByteBuf.ensureWritable(AbstractByteBuf.java:282)
at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1075)
at io.netty.buffer.ByteBufOutputStream.write(ByteBufOutputStream.java:67)
at java.base/sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:233)
at java.base/sun.nio.cs.StreamEncoder.implFlushBuffer(StreamEncoder.java:312)
at java.base/sun.nio.cs.StreamEncoder.implFlush(StreamEncoder.java:316)
at java.base/sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:153)
at java.base/java.io.OutputStreamWriter.flush(OutputStreamWriter.java:254)
at hello.world.HelloController.lambda$getJson8$8(HelloController.java:107)
at io.micronaut.core.io.Writable.writeTo(Writable.java:77)
at io.micronaut.http.server.netty.RoutingInBoundHandler.lambda$encodeHttpResponse$13(RoutingInBoundHandler.java:1542)
at io.micronaut.scheduling.instrument.InvocationInstrumenterWrappedRunnable.run(InvocationInstrumenterWrappedRunnable.java:47)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
I think the problem is in RoutingInBoundHandler, which buffers responses before sending them out, but there seems to be no way to work around that.
Micronaut already supports Jackson streaming, just in a non-blocking manner. You can create a Flowable<Foo> response and emit chunks of JSON, which will be streamed and written in a non-blocking way.
For large amounts of data you don't really want to use blocking constructs like Writer/OutputStream with Netty; you are better off emitting chunks of byte[] (or whatever) from a Flowable.
If you REALLY REALLY want to use Writable / OutputStream for large responses, you would probably be better off switching to a server implementation that better supports blocking I/O, like micronaut-server-jetty, which uses Jetty (a servlet container) for the server and will let you do what you want here.
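As a rough sketch of the chunked approach the comment describes: produce the body as bounded chunks instead of writing through a blocking Writer. The class name and chunk size below are illustrative, and a plain Iterator stands in for the reactive type so the example is self-contained; in a Micronaut controller the same producer would back a Flowable<byte[]>.

```java
import java.util.Iterator;
import java.util.NoSuchElementException;

// Produces the same "Hello" payload as the repro, but as bounded chunks
// that can be emitted one at a time instead of buffered as a whole.
public class HelloChunks implements Iterator<byte[]> {
    private static final byte[] HELLO = "Hello".getBytes();
    private final long totalWrites;
    private final int writesPerChunk;
    private long emitted;

    public HelloChunks(long totalWrites, int writesPerChunk) {
        this.totalWrites = totalWrites;
        this.writesPerChunk = writesPerChunk;
    }

    @Override
    public boolean hasNext() {
        return emitted < totalWrites;
    }

    @Override
    public byte[] next() {
        if (!hasNext()) throw new NoSuchElementException();
        int n = (int) Math.min(writesPerChunk, totalWrites - emitted);
        byte[] chunk = new byte[n * HELLO.length];
        for (int i = 0; i < n; i++) {
            System.arraycopy(HELLO, 0, chunk, i * HELLO.length, HELLO.length);
        }
        emitted += n;
        return chunk; // at most writesPerChunk * 5 bytes live at once
    }

    public static void main(String[] args) {
        HelloChunks chunks = new HelloChunks(10, 4); // tiny numbers for the demo
        long total = 0;
        while (chunks.hasNext()) total += chunks.next().length;
        System.out.println(total); // 50 bytes: 10 writes * 5 bytes
    }
}
```

Wrapped in a Flowable (e.g. from an Iterable over such chunks), Netty requests chunks only as the client drains them, so memory use is bounded by the chunk size rather than the response size.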
Micronaut already supports Jackson streaming just in a non-blocking manner. You can create a Flowable response and emit chunks of JSON which will be streamed and written in a non-blocking way
Yes, but that only works if the response is naturally a sequence of individual items/chunks, or you split it into them yourself. With even a slightly more complex item, such as in this stackoverflow question, it won't work. If Micronaut supported writing directly to the output stream without buffering, both Writable flushing and Jackson streaming like this would be possible without running out of memory.
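The manual splitting mentioned above can be done by serializing the envelope and each element separately and emitting the pieces as chunks. A minimal sketch, with Jackson omitted and the JSON built by hand purely for illustration (class and method names are hypothetical):

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

// Splits a JSON array response into small, independently emittable chunks:
// an opening bracket, one chunk per element (with separators), and a close.
public class JsonChunker {
    public static Iterator<String> chunks(List<String> items) {
        return new Iterator<String>() {
            private int index = -1; // -1 = "[" not yet emitted

            @Override public boolean hasNext() {
                return index <= items.size();
            }

            @Override public String next() {
                if (index == -1) { index++; return "["; }
                if (index == items.size()) { index++; return "]"; }
                String sep = index == 0 ? "" : ",";
                return sep + "\"" + items.get(index++) + "\"";
            }
        };
    }

    public static void main(String[] args) {
        StringBuilder body = new StringBuilder();
        chunks(Arrays.asList("a", "b")).forEachRemaining(body::append);
        System.out.println(body); // ["a","b"]
    }
}
```

Each chunk is tiny, so the concatenated stream forms the full JSON document without any single buffer ever holding the whole response.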
micronaut-server-jetty is a good pointer, I will have to check that out.
Note: A similar issue seems to also prevent the Jackson streaming API from working in this case. It works on Spring Boot.
This issue originated from the stackoverflow question Stream large response in micronaut controller without going out of memory.