[exporterhelper] Awkwardness due to API between queue sender and batch sender #10368
Comments
We still need to limit concurrency in the queue or batch configuration. Otherwise, the queue becomes useless. From the user configuration perspective, I think we should provide:

```yaml
batcher:
  # The maximum number of batches that can be exported concurrently.
  # This can only be used with enabled sending_queue.
  max_concurrency:
```

If this option is defined and both sending_queue and batcher are enabled, the … As you mentioned, this will complicate the contract between the queue and batcher senders. I think we can move the queue consumers into some sort of another component, which will be replaced by the batcher if …
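For context, a combined configuration under this proposal might look as follows. Note that `max_concurrency` is only the option proposed above, and the exporter name and values here are illustrative, not part of any released configuration:

```yaml
exporters:
  otlp:
    sending_queue:
      enabled: true
      num_consumers: 10
    batcher:
      enabled: true
      min_size_items: 8192
      flush_timeout: 200ms
      max_concurrency: 4   # proposed: cap on concurrent batch exports
```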
@moh-osman3 and I studied the same problem: we want to avoid use of a queue_sender, but we want concurrency in the batcher. I reason that another user will come along asking not to use the queue sender, not to use the batch sender, and still want concurrency. Therefore, I propose we try to add a new sender stage purely for concurrency control (@moh-osman3 has been investigating how). In such a future, I think the existing queue_sender num_consumers should dictate how many workers perform the CPU- and I/O-intensive work of gathering items from the queue, not how many export threads are used.
@dmitryax @carsonip This PR #10478 shows an implementation of the concurrency_sender, which will limit the number of goroutines used for export. This removes the concurrencyLimit from the batch_sender, so that batch exports are triggered only by size/timeout, and concurrency is limited by the new sender independently of queue_sender.num_consumers. Any thoughts on this approach?
@jmacd, I just wanted to clarify that the existing implementation of the batcher sender is concurrent. There is no limit if the queue sender is not enabled. This issue is about limiting the concurrency of the batcher sender.
Is your feature request related to a problem? Please describe.
Related to batch sender in #8122
Currently, the queue sender and batch sender work together fine, but the config options
`sending_queue.num_consumers`
and the batch sender's `concurrencyLimit`
make them a little awkward to use. The awkwardness comes from the fact that the batch sender can never receive more requests than the queue sender's `num_consumers`, so it may be forced to export a smaller batch because of it. I am also not sure whether this presents a performance problem at scale, where a lot of goroutines would be involved.
The batch sender concurrency check has also been prone to unfavorable goroutine scheduling, as reported in #9952.
Describe the solution you'd like
Ideally, the queue sender could keep sending to the batch sender without an artificial limit, so that batching is triggered only by `min_size_items` and `flush_timeout`.
I don't have an idea of how to implement this yet. Also, it may require a change in the contract between the exporterhelper parts.