
Extended API to support batched producer/consumer methods #53

Closed
ChristianWulf opened this issue Feb 22, 2015 · 5 comments

Comments

@ChristianWulf

Could we save some of the volatile index and array updates by using a batched/buffered add for the producer in a high-throughput context? Optimally, we could just wrap/decorate an available SpSc queue and adapt the element-adding behaviour.

@nitsanw
Contributor

nitsanw commented Feb 22, 2015

For SPSC you cannot avoid an ordered write to the array, otherwise you will lose the required ordering. You can delay it to make a bunch of elements visible at once, but I doubt there's much to gain for SPSC.
On the MP/MC side you can limit contention and reduce overheads by batching inserts/reads, but this comes at an increased risk of bubbles in the queue, so it can hurt latency. You would have to test and see if it helps particular applications. It is an interesting option for the alternative interfaces set out under experimental.
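The delayed-visibility idea above can be sketched as follows. This is a hypothetical, simplified ring buffer (not JCTools code, names are illustrative): element stores are plain writes, and a single ordered write of the producer index publishes the whole batch at once instead of one ordered write per element.

```java
import java.util.concurrent.atomic.AtomicLong;
import java.util.function.Consumer;

// Hypothetical sketch: SPSC ring buffer publishing a batch with ONE
// ordered write of the producer index.
final class SpscBatchBuffer<E> {
    private final Object[] buffer;
    private final int mask;
    private final AtomicLong producerIndex = new AtomicLong();
    private final AtomicLong consumerIndex = new AtomicLong();

    SpscBatchBuffer(int capacityPow2) {
        buffer = new Object[capacityPow2];
        mask = capacityPow2 - 1;
    }

    /** Producer side: plain stores per element, one ordered index publish. */
    boolean offerBatch(E[] batch, int n) {
        long p = producerIndex.get(); // single producer, so this is stable
        if (p + n - consumerIndex.get() > buffer.length) {
            return false; // not enough free slots for the whole batch
        }
        for (int i = 0; i < n; i++) {
            buffer[(int) (p + i) & mask] = batch[i]; // plain store
        }
        producerIndex.lazySet(p + n); // one ordered write publishes the batch
        return true;
    }

    /** Consumer side: drain up to limit elements, return how many. */
    @SuppressWarnings("unchecked")
    int drain(Consumer<E> c, int limit) {
        long ci = consumerIndex.get();
        long available = producerIndex.get() - ci; // sees whole batches only
        int n = (int) Math.min(available, limit);
        for (int i = 0; i < n; i++) {
            int slot = (int) (ci + i) & mask;
            c.accept((E) buffer[slot]);
            buffer[slot] = null;
        }
        consumerIndex.lazySet(ci + n);
        return n;
    }
}
```

Note the trade-off described above: elements written with plain stores are not visible to the consumer until the index is published, so visibility is delayed by up to a full batch.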


@ChristianWulf
Author

Yes, what you call delay is what I mean. Why do you doubt a throughput increase for SPSC? I have scenarios where 1,000 to 100,000 elements per second are passed through an SPSC queue.

@nitsanw
Contributor

nitsanw commented Feb 23, 2015

I recommend you test and measure rather than speculate. On current hardware the SPSC queue has been measured to deliver a throughput of 350-470M messages per second, so 1K-100K per second is not going to be an issue.
For SPSC the 'cost' of offer/poll is down to 2-3ns; there's really very little left to shave, and I doubt there are many real-world use cases where an improvement would make a difference to an application. I'm very happy to be proved wrong though.
As I said, there's value here for the MP/MC cases. The interface needs to be considered carefully, and it would be easier to deliver as part of the new queue interfaces from experimental.
Note that a 'decorator' pattern doesn't really work in these cases as the decorator has no way to claim a batch of slots. This will require some internal knowledge/access so you are looking at extending existing classes.
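To illustrate the point about decorators, here is a hypothetical wrapper over a plain java.util.Queue (class and method names are illustrative): with no access to the underlying indices, the best it can do is loop over ordinary offers, so nothing is actually batched and each element still pays the full per-element cost.

```java
import java.util.Queue;

// Hypothetical decorator: it cannot claim a batch of slots up front, so a
// "batched" offer degrades to a loop of ordinary per-element offers.
final class BatchingQueueDecorator<E> {
    private final Queue<E> delegate;

    BatchingQueueDecorator(Queue<E> delegate) {
        this.delegate = delegate;
    }

    /** Offers up to n elements; returns how many were actually inserted. */
    int offerBatch(E[] batch, int n) {
        int inserted = 0;
        // No pre-claim is possible: we only learn a slot was unavailable
        // when an individual offer fails mid-batch.
        while (inserted < n && delegate.offer(batch[inserted])) {
            inserted++;
        }
        return inserted;
    }
}
```

This is why batch claim support has to come from inside the queue implementation, by extending the existing classes rather than wrapping them.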

@nitsanw nitsanw changed the title Batched producer decorator Extended API to support batched producer/consumer methods Apr 17, 2015
@nitsanw
Contributor

nitsanw commented May 17, 2015

Batch produce/consume interfaces which leave out the batch size are harder to reason about. The issues I see are around commitment:

  • If the consumer cannot handle the current element, it is stuck: the element has already been removed and there is nowhere to return it to. It can, however, notify the queue that it will be unable to handle the next element.
  • The producer side has a catch-22: if we take an element from the producer we can't return it, yet we may not be able to actually put it in the queue. If we instead claim slots first and then take from the producer, we may find the producer has nothing to give us.

In either case it is fine for the queue to fulfil only part of the declared batch, because the caller has committed either to producing enough elements to fill the claimed slots or to consuming up to the full size of the batch.
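A sketch of the commitment contract with a declared batch size (all names here are illustrative, not a proposed API): the queue claims the slots first, and only then pulls elements from the caller's supplier, so a failed claim never consumes anything from the producer.

```java
import java.util.concurrent.atomic.AtomicLong;
import java.util.function.Supplier;

// Hypothetical claim-then-fill queue: the declared batch size lets the
// queue reserve n slots before touching the supplier, avoiding the
// catch-22 of holding an element it cannot store.
final class ClaimThenFillQueue<E> {
    private final Object[] buffer;
    private final int mask;
    private final AtomicLong producerIndex = new AtomicLong();
    private final AtomicLong consumerIndex = new AtomicLong();

    ClaimThenFillQueue(int capacityPow2) {
        buffer = new Object[capacityPow2];
        mask = capacityPow2 - 1;
    }

    /** Caller commits to supplying exactly n elements; fails wholesale if
        n free slots cannot be claimed, leaving the supplier untouched. */
    boolean fill(Supplier<E> s, int n) {
        long p = producerIndex.get();
        if (p + n - consumerIndex.get() > buffer.length) {
            return false; // claim failed, nothing taken from the supplier
        }
        for (int i = 0; i < n; i++) {
            buffer[(int) (p + i) & mask] = s.get();
        }
        producerIndex.lazySet(p + n); // publish the whole batch
        return true;
    }

    @SuppressWarnings("unchecked")
    E poll() {
        long c = consumerIndex.get();
        if (c == producerIndex.get()) return null; // empty
        int slot = (int) c & mask;
        E e = (E) buffer[slot];
        buffer[slot] = null;
        consumerIndex.lazySet(c + 1);
        return e;
    }
}
```

The key property is that the only partial outcome is the queue delivering fewer batches, never a half-consumed supplier.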

@nitsanw
Contributor

nitsanw commented Oct 30, 2015

Resolved with the MessagePassingQueue API.

@nitsanw nitsanw closed this as completed Oct 30, 2015