Batch receiving messages not returning expected amount when using the Message Receiver #441

Closed
quickskape opened this Issue Apr 12, 2018 · 2 comments

quickskape commented Apr 12, 2018

I may be misunderstanding how maxMessageCount is intended to work; could anyone shed some insight into the behavior I am experiencing?

I have a queue that regularly receives around 4 million messages within a 6-hour window, and I am trying to reduce the latency between a message being submitted and being completed.

Actual Behavior

  1. messageReceiver.ReceiveAsync(100, TimeSpan.FromMinutes(1)) is receiving batches ranging from 1 to 43 messages, even when the queue has 10,000+ messages waiting (a minimal sketch of my receive loop follows the Expected Behavior section below).

Expected Behavior

  1. Batches of 100 messages being received consistently until the queue has fewer than 100 messages remaining.
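
For reference, here is a minimal sketch of the receive loop (the queue name and the connection-string environment variable are placeholders for my real setup):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;
using Microsoft.Azure.ServiceBus.Core;

class Repro
{
    static async Task Main()
    {
        var receiver = new MessageReceiver(
            Environment.GetEnvironmentVariable("SERVICEBUS_CONNECTION"), // placeholder
            "myqueue",                                                   // placeholder
            ReceiveMode.PeekLock);

        while (true)
        {
            // Ask for up to 100 messages, waiting at most one minute.
            var batch = await receiver.ReceiveAsync(100, TimeSpan.FromMinutes(1));
            if (batch == null) break; // timed out with no messages available

            Console.WriteLine($"Received {batch.Count} messages"); // observed: 1-43

            foreach (var message in batch)
                await receiver.CompleteAsync(message.SystemProperties.LockToken);
        }

        await receiver.CloseAsync();
    }
}
```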

Versions

  • OS platform and version: Windows 10
  • .NET Version: > .NET Core 2.0
  • NuGet package version or commit ID: Microsoft.Azure.ServiceBus, Version 3.0.0-preview-01

Additional Info

Queue Setup:

  • Enable Partitioning: false
  • Maximum size: 5 GB
  • Lock Duration: 60 Seconds
  • Messaging tier: Standard

nemakam commented Apr 12, 2018

@quickskape
This is a known issue, and it is by design as of now.
We are constantly thinking of ways to fix this, but multiple issues come into the picture. AMQP is a streaming protocol; it is not great at batching, so batching is something we have to build on top of it. There is always a trade-off on the client side over how long to wait to collect the messages being streamed: higher latency versus getting a bigger batch of messages.
I'm going to close this issue as by-design, but we are constantly thinking about ways to improve it.

Also, if you are looking to improve perf, playing around with prefetchCount can be very valuable here.
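
For example, something along these lines (queue name, connection string, and the value 300 are only placeholders for illustration); prefetch keeps a client-side buffer topped up in the background, so ReceiveAsync can often be served from memory rather than waiting on the wire:

```csharp
using System;
using Microsoft.Azure.ServiceBus;
using Microsoft.Azure.ServiceBus.Core;

class PrefetchExample
{
    static void Main()
    {
        // prefetchCount can be passed to the constructor...
        var receiver = new MessageReceiver(
            Environment.GetEnvironmentVariable("SERVICEBUS_CONNECTION"), // placeholder
            "myqueue",                                                   // placeholder
            ReceiveMode.PeekLock,
            RetryPolicy.Default,
            prefetchCount: 300); // illustrative value

        // ...or adjusted via the property after construction.
        receiver.PrefetchCount = 300;
    }
}
```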

nemakam closed this Apr 12, 2018

quickskape commented Apr 16, 2018

Thanks for the help. I am testing an implementation where I have increased the prefetchCount to around three times the maxMessageCount; this seems to allow the buffer to fill between batches being received and completed. The batch size is now almost always the specified maxMessageCount.
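
For anyone who finds this later, here is roughly what I am testing (a sketch; the queue name, connection string, and values are placeholders for my setup):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;
using Microsoft.Azure.ServiceBus.Core;

class PrefetchTest
{
    const int MaxMessageCount = 100;

    static async Task Main()
    {
        var receiver = new MessageReceiver(
            Environment.GetEnvironmentVariable("SERVICEBUS_CONNECTION"), // placeholder
            "myqueue",                                                   // placeholder
            ReceiveMode.PeekLock)
        {
            // ~3x the batch size, so the client-side buffer refills
            // while the previous batch is being completed.
            PrefetchCount = MaxMessageCount * 3
        };

        while (true)
        {
            var batch = await receiver.ReceiveAsync(MaxMessageCount, TimeSpan.FromMinutes(1));
            if (batch == null) break;

            Console.WriteLine($"Received {batch.Count} messages"); // now ~100 each time

            foreach (var message in batch)
                await receiver.CompleteAsync(message.SystemProperties.LockToken);
        }

        await receiver.CloseAsync();
    }
}
```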

Thanks again 👍
