
ERROR org.opensearch.dataprepper.GrpcRequestExceptionHandler in dataprepper version 2.7.0 #4502

Open
Megharaj-N opened this issue May 6, 2024 · 5 comments
Labels
question Further information is requested


Megharaj-N commented May 6, 2024

Describe the bug
We have a Data Prepper and OpenTelemetry setup running in our Kubernetes ecosystem to collect metrics and traces and send the data to OpenSearch. This setup was working perfectly fine when we were running the versions below.

  Data Prepper - 2.6.0
  OpenTelemetry Collector - 0.83.0

Recently we upgraded only the Data Prepper component, to remove certain vulnerabilities associated with its image. We are now on the versions below:

  Data Prepper -  2.7.0
  OpenTelemetry Collector - 0.83.0

However, since this upgrade we are encountering the error below in Data Prepper, and metrics/traces are not reaching OpenSearch.

[armeria-common-worker-epoll-3-2] ERROR org.opensearch.dataprepper.GrpcRequestExceptionHandler - Unexpected exception handling gRPC request
com.linecorp.armeria.common.stream.ClosedStreamException: received a RST_STREAM frame: CANCEL
[pool-9-thread-94] ERROR org.opensearch.dataprepper.plugins.source.otelmetrics.OTelMetricsGrpcService - Failed to write the request of size 120068 due to:
java.util.concurrent.TimeoutException: Pipeline [otel-metrics-pipeline] - Buffer does not have enough capacity left for the number of records: 286, timed out waiting for slots.
at org.opensearch.dataprepper.plugins.buffer.blockingbuffer.BlockingBuffer.doWriteAll(BlockingBuffer.java:127) ~[blocking-buffer-2.7.0.jar:?]
at org.opensearch.dataprepper.model.buffer.AbstractBuffer.writeAll(AbstractBuffer.java:107) ~[data-prepper-api-2.7.0.jar:?]
at org.opensearch.dataprepper.model.buffer.DelegatingBuffer.writeAll(DelegatingBuffer.java:48) ~[data-prepper-api-2.7.0.jar:?]
at org.opensearch.dataprepper.model.buffer.DelegatingBuffer.writeAll(DelegatingBuffer.java:48) ~[data-prepper-api-2.7.0.jar:?]
at org.opensearch.dataprepper.parser.CircuitBreakingBuffer.writeAll(CircuitBreakingBuffer.java:50) ~[data-prepper-core-2.7.0.jar:?]
at org.opensearch.dataprepper.plugins.source.otelmetrics.OTelMetricsGrpcService.processRequest(OTelMetricsGrpcService.java:97) ~[otel-metrics-source-2.7.0.jar:?]
at org.opensearch.dataprepper.plugins.source.otelmetrics.OTelMetricsGrpcService.lambda$export$0(OTelMetricsGrpcService.java:83) ~[otel-metrics-source-2.7.0.jar:?]
at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:141) ~[micrometer-core-1.11.5.jar:1.11.5]
at org.opensearch.dataprepper.plugins.source.otelmetrics.OTelMetricsGrpcService.export(OTelMetricsGrpcService.java:83) ~[otel-metrics-source-2.7.0.jar:?]
at io.opentelemetry.proto.collector.metrics.v1.MetricsServiceGrpc$MethodHandlers.invoke(MetricsServiceGrpc.java:246) ~[opentelemetry-proto-0.16.0-alpha.jar:0.16.0]
at io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:182) ~[grpc-stub-1.58.0.jar:1.58.0]
at com.linecorp.armeria.internal.server.grpc.AbstractServerCall.invokeOnMessage(AbstractServerCall.java:387) ~[armeria-grpc-1.26.4.jar:?]
at com.linecorp.armeria.internal.server.grpc.AbstractServerCall.lambda$onRequestMessage$2(AbstractServerCall.java:351) ~[armeria-grpc-1.26.4.jar:?]
at com.linecorp.armeria.internal.shaded.guava.util.concurrent.SequentialExecutor$1.run(SequentialExecutor.java:125) [armeria-1.26.4.jar:?]
at com.linecorp.armeria.internal.shaded.guava.util.concurrent.SequentialExecutor$QueueWorker.workOnQueue(SequentialExecutor.java:237) [armeria-1.26.4.jar:?]
at com.linecorp.armeria.internal.shaded.guava.util.concurrent.SequentialExecutor$QueueWorker.run(SequentialExecutor.java:182) [armeria-1.26.4.jar:?]
at com.linecorp.armeria.common.DefaultContextAwareRunnable.run(DefaultContextAwareRunnable.java:45) [armeria-1.26.4.jar:?]
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
at java.base/java.lang.Thread.run(Thread.java:840) [?:?] 
[armeria-common-worker-epoll-3-1] WARN  io.netty.util.concurrent.AbstractEventExecutor - A task raised an exception. Task: com.linecorp.armeria.common.DefaultContextAwareRunnable@23d5df32
java.lang.IllegalStateException: call already closed. status: Status{code=RESOURCE_EXHAUSTED, description=Pipeline [otel-metrics-pipeline] - Buffer does not have enough capacity left for the number of records: 286, timed out waiting for slots., cause=null}, exception: org.opensearch.dataprepper.exceptions.BufferWriteException: Pipeline [otel-metrics-pipeline] - Buffer does not have enough capacity left for the number of records: 286, timed out waiting for slots.
at com.linecorp.armeria.internal.shaded.guava.base.Preconditions.checkState(Preconditions.java:835) ~[armeria-1.26.4.jar:?]
at com.linecorp.armeria.internal.server.grpc.AbstractServerCall.doClose(AbstractServerCall.java:245) ~[armeria-grpc-1.26.4.jar:?]
at com.linecorp.armeria.internal.server.grpc.AbstractServerCall.lambda$close$1(AbstractServerCall.java:227) ~[armeria-grpc-1.26.4.jar:?]
at com.linecorp.armeria.common.DefaultContextAwareRunnable.run(DefaultContextAwareRunnable.java:45) ~[armeria-1.26.4.jar:?]
at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:173) ~[netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:166) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:413) [netty-transport-classes-epoll-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at java.base/java.lang.Thread.run(Thread.java:840) [?:?]

To Reproduce

Bring up a similar setup in a Kubernetes ecosystem to collect and send metrics and traces to OpenSearch.

  Data Prepper -  2.7.0
  OpenTelemetry Collector - 0.83.0

Expected behavior

Metrics and traces collected from Kubernetes need to flow to OpenSearch seamlessly.

@Megharaj-N Megharaj-N added bug Something isn't working untriaged labels May 6, 2024
@KarstenSchnitter
Collaborator

Hi @Megharaj-N,

Thanks for reaching out.

Have you tried increasing the buffer in DataPrepper or reducing the metrics batch size? The error message is:

Status{code=RESOURCE_EXHAUSTED, description=Pipeline [otel-metrics-pipeline] - Buffer does not have enough capacity left for the number of records: 286, timed out waiting for slots., cause=null}

This indicates that you are emitting 286 records that need to be put into the metrics input buffer, but the write gets stuck because the buffer is not being emptied. That can happen if the sink is slow or unreachable. Another possibility is that you are providing batches too large to fit into the buffer entirely.
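For reference, the buffer is configured per pipeline. A minimal sketch, assuming the default `bounded_blocking` buffer (12800 records and batch size 200 are its defaults; the larger values here are purely illustrative):

```yaml
otel-metrics-pipeline:
  source:
    otel_metrics_source:
  buffer:
    bounded_blocking:
      buffer_size: 25600   # max records the buffer can hold; default is 12800
      batch_size: 400      # records read out per batch; default is 200
  # ... processors and sink unchanged
```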

Best Regards,
Karsten

@dlvenable
Member

@Megharaj-N , In 2.7 we changed the behavior of the source to output individual metrics instead of the whole group. You can multiply your buffer size by what your typical batch size would be.

Say, for example, you have 1,000 metrics per batch: multiply the buffer size by 1,000.

This is the PR that introduced this change: #4183
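Concretely, a sketch of that arithmetic (assuming 1,000 metrics per batch and starting from the `bounded_blocking` defaults of 12800/200; the scaled numbers are illustrative):

```yaml
otel-metrics-pipeline:
  buffer:
    bounded_blocking:
      buffer_size: 12800000   # 12800 * 1000, since each metric is now its own record
      batch_size: 200000      # 200 * 1000
```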

@dlvenable dlvenable added question Further information is requested and removed bug Something isn't working labels May 14, 2024
@Megharaj-N
Author

Megharaj-N commented May 23, 2024

Thank you @KarstenSchnitter and @dlvenable. The buffer-related error has stopped after increasing the buffer size and reducing the batch size in the Data Prepper config. However, we are still encountering another error, shown below.
It seems to me that the connection between OpenTelemetry and Data Prepper is being dropped, and it has not resolved even after updating the OpenTelemetry exporter's TLS configuration as below.

      exporters:
        otlp/data-prepper:
          endpoint: data-prepper:218xx
          tls:
           insecure: true
           insecure_skip_verify: true

Error log snippet from dataprepper

[armeria-common-worker-epoll-3-2] ERROR org.opensearch.dataprepper.GrpcRequestExceptionHandler - Unexpected exception handling gRPC request
com.linecorp.armeria.common.stream.ClosedStreamException: received a RST_STREAM frame: CANCEL
[armeria-common-worker-epoll-3-2] WARN  io.netty.util.concurrent.AbstractEventExecutor - A task raised an exception. Task: com.linecorp.armeria.common.DefaultContextAwareRunnable@4cd916e4
java.lang.IllegalStateException: call is closed
at com.linecorp.armeria.internal.shaded.guava.base.Preconditions.checkState(Preconditions.java:512) ~[armeria-1.26.4.jar:?]
at com.linecorp.armeria.internal.server.grpc.AbstractServerCall.doSendHeaders(AbstractServerCall.java:464) ~[armeria-grpc-1.26.4.jar:?]
at com.linecorp.armeria.internal.server.grpc.AbstractServerCall.lambda$sendHeaders$3(AbstractServerCall.java:454) ~[armeria-grpc-1.26.4.jar:?]
at com.linecorp.armeria.common.DefaultContextAwareRunnable.run(DefaultContextAwareRunnable.java:45) ~[armeria-1.26.4.jar:?]
at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:173) ~[netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:166) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:413) [netty-transport-classes-epoll-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at java.base/java.lang.Thread.run(Thread.java:840) [?:?]
[armeria-common-worker-epoll-3-2] WARN  io.netty.util.concurrent.AbstractEventExecutor - A task raised an exception. Task: com.linecorp.armeria.common.DefaultContextAwareRunnable@41a4ca0e
java.lang.IllegalStateException: sendHeaders has not been called
at com.linecorp.armeria.internal.shaded.guava.base.Preconditions.checkState(Preconditions.java:512) ~[armeria-1.26.4.jar:?]
at com.linecorp.armeria.server.grpc.UnaryServerCall.doSendMessage(UnaryServerCall.java:131) ~[armeria-grpc-1.26.4.jar:?]
at com.linecorp.armeria.server.grpc.UnaryServerCall.lambda$sendMessage$1(UnaryServerCall.java:122) ~[armeria-grpc-1.26.4.jar:?]
at com.linecorp.armeria.common.DefaultContextAwareRunnable.run(DefaultContextAwareRunnable.java:45) ~[armeria-1.26.4.jar:?]
at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:173) ~[netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:166) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:413) [netty-transport-classes-epoll-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at java.base/java.lang.Thread.run(Thread.java:840) [?:?]
[armeria-common-worker-epoll-3-2] WARN  io.netty.util.concurrent.AbstractEventExecutor - A task raised an exception. Task: com.linecorp.armeria.common.DefaultContextAwareRunnable@4ede737a
java.lang.IllegalStateException: call already closed. status: Status{code=OK, description=null, cause=null}, exception: null
at com.linecorp.armeria.internal.shaded.guava.base.Preconditions.checkState(Preconditions.java:835) ~[armeria-1.26.4.jar:?]
at com.linecorp.armeria.internal.server.grpc.AbstractServerCall.doClose(AbstractServerCall.java:245) ~[armeria-grpc-1.26.4.jar:?]
at com.linecorp.armeria.internal.server.grpc.AbstractServerCall.lambda$close$1(AbstractServerCall.java:227) ~[armeria-grpc-1.26.4.jar:?]
at com.linecorp.armeria.common.DefaultContextAwareRunnable.run(DefaultContextAwareRunnable.java:45) ~[armeria-1.26.4.jar:?]
at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:173) ~[netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:166) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:413) [netty-transport-classes-epoll-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at java.base/java.lang.Thread.run(Thread.java:840) [?:?]
[armeria-common-worker-epoll-3-2] ERROR org.opensearch.dataprepper.GrpcRequestExceptionHandler - Unexpected exception handling gRPC request
com.linecorp.armeria.common.stream.ClosedStreamException: received a RST_STREAM frame: CANCEL

@KarstenSchnitter
Collaborator

@Megharaj-N this looks like a client-side timeout. Data Prepper seems to be taking more time than your client allows, so the client closes the connection before Data Prepper is able to respond. This causes a closed stream on the Data Prepper side. What is the reaction of your client in that case? Can you try to increase the timeout?
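One knob to try on the collector side is the OTLP exporter's `timeout` (a standard exporterhelper setting, default 5s); the 30s value below is illustrative:

```yaml
exporters:
  otlp/data-prepper:
    endpoint: data-prepper:218xx
    tls:
      insecure: true
      insecure_skip_verify: true
    timeout: 30s   # give Data Prepper more time to respond before the client cancels
```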

@harrybaker-srt

> (quoted @Megharaj-N's earlier comment and error log in full)

Hi @Megharaj-N, could you please let me know what your buffer/batch sizes were before and after? We are experiencing the same issue on Data Prepper 2.6.2.

Thanks!
