
Support batching backends #76

Open
plaidfinch opened this issue Mar 25, 2021 · 0 comments
Labels: enhancement (New feature or request), performance

Comments

@plaidfinch (Contributor)

More efficient backends may wish to batch messages transparently. This can be enabled by adding an async flush method to Transmitter, with a default implementation that does nothing. Specifically:

- Invoking recv-requiring methods on Chan will implicitly call flush first (or concurrently?) to ensure that communication dependency ordering is preserved.
- flush must also be called implicitly before split, since the concurrent receiving half might be waiting for a response to something already sent.
- A new method should be added to Chan that manually invokes flush.
- close should become an async function that calls flush prior to dropping the channel, but only if there are no other references to the Arc awaiting the Tx; that is, the transmitter should be automatically flushed on close only when the session is not going to be resumed by an outer context.
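A rough sketch of the proposed shape (only the names Transmitter, Chan, flush, and close come from the proposal above; BatchingTx, the String message type, and the tiny block_on executor are illustrative stand-ins, not the crate's real API):

```rust
trait Transmitter {
    fn send(&mut self, msg: String);
    // Proposed addition: a default implementation that does nothing,
    // so non-batching backends need no code changes.
    async fn flush(&mut self) {}
}

// A toy batching backend: `send` only buffers; `flush` moves the
// buffered messages onto the "wire".
struct BatchingTx {
    buffer: Vec<String>,
    wire: Vec<String>,
}

impl Transmitter for BatchingTx {
    fn send(&mut self, msg: String) {
        self.buffer.push(msg);
    }
    async fn flush(&mut self) {
        self.wire.append(&mut self.buffer);
    }
}

struct Chan<Tx: Transmitter> {
    tx: Tx,
    inbox: Vec<String>, // stand-in for the receiving half
}

impl<Tx: Transmitter> Chan<Tx> {
    // recv-requiring methods implicitly flush first, preserving
    // communication dependency ordering.
    async fn recv(&mut self) -> Option<String> {
        self.tx.flush().await;
        self.inbox.pop()
    }

    // The proposed manual flush on `Chan`.
    async fn flush(&mut self) {
        self.tx.flush().await;
    }

    // `close` becomes async and flushes before dropping the channel.
    // (The Arc reference-count check is omitted from this sketch.)
    async fn close(mut self) {
        self.tx.flush().await;
    }
}

// Minimal single-threaded executor so the sketch runs without a runtime.
fn block_on<F: std::future::Future>(fut: F) -> F::Output {
    use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};
    fn raw() -> RawWaker {
        RawWaker::new(std::ptr::null(), &VTABLE)
    }
    static VTABLE: RawWakerVTable =
        RawWakerVTable::new(|_| raw(), |_| {}, |_| {}, |_| {});
    let waker = unsafe { Waker::from_raw(raw()) };
    let mut cx = Context::from_waker(&waker);
    let mut fut = std::pin::pin!(fut);
    loop {
        if let Poll::Ready(v) = fut.as_mut().poll(&mut cx) {
            return v;
        }
    }
}

fn main() {
    let mut chan = Chan {
        tx: BatchingTx { buffer: Vec::new(), wire: Vec::new() },
        inbox: vec!["reply".to_string()],
    };
    chan.tx.send("request".to_string());
    // Nothing has hit the wire yet; `recv` flushes implicitly first.
    let got = block_on(chan.recv());
    println!("received {:?}, wire now {:?}", got, chan.tx.wire);
    block_on(chan.close());
}
```

Note that the default `async fn flush(&mut self) {}` body is what makes the change backwards-compatible for non-batching backends.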

For non-batching backends, no code changes are necessary; uses of close will only need to be updated to be async.

The primary use case for this would be in the serde backend, which could then support a BufferedSender/BufferedReceiver pair, parametrized by the same parameters as the underlying non-buffering transport, plus a maximum buffer size which, when exceeded, triggers an automatic flush.
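The auto-flush-on-overflow behavior might look something like the following (BufferedSender is the name suggested above; the Sender trait, the byte-frame representation, and the constructor are hypothetical, and the async machinery is elided for brevity):

```rust
// Hypothetical underlying transport interface; the real serde backend's
// parameters (serializer, framing, writer, etc.) are omitted.
trait Sender {
    fn send_all(&mut self, frames: Vec<Vec<u8>>);
}

// A toy transport that records each flushed batch, for demonstration.
struct VecSender {
    batches: Vec<Vec<Vec<u8>>>,
}

impl Sender for VecSender {
    fn send_all(&mut self, frames: Vec<Vec<u8>>) {
        self.batches.push(frames);
    }
}

struct BufferedSender<S: Sender> {
    inner: S,
    buffer: Vec<Vec<u8>>,
    max_buffered: usize, // maximum buffer size before an automatic flush
}

impl<S: Sender> BufferedSender<S> {
    fn new(inner: S, max_buffered: usize) -> Self {
        Self { inner, buffer: Vec::new(), max_buffered }
    }

    fn send(&mut self, frame: Vec<u8>) {
        self.buffer.push(frame);
        // Exceeding the maximum buffer size triggers an automatic flush.
        if self.buffer.len() > self.max_buffered {
            self.flush();
        }
    }

    fn flush(&mut self) {
        if !self.buffer.is_empty() {
            self.inner.send_all(std::mem::take(&mut self.buffer));
        }
    }
}

fn main() {
    let mut tx = BufferedSender::new(VecSender { batches: Vec::new() }, 2);
    tx.send(b"a".to_vec());
    tx.send(b"b".to_vec()); // still buffered: 2 frames, limit not exceeded
    tx.send(b"c".to_vec()); // third frame exceeds the limit: auto-flush
    println!("batches on the wire: {:?}", tx.inner.batches);
    tx.flush(); // manual flush of anything left over (a no-op here)
}
```

Batching all three frames into a single send_all call is the whole point: one syscall or one serialized packet per batch instead of one per message.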

@plaidfinch plaidfinch added the enhancement New feature or request label Mar 25, 2021
@sdleffler sdleffler added this to the 0.4 milestone Mar 27, 2021
@plaidfinch plaidfinch removed this from the 0.4 milestone Mar 30, 2021