
Add NamedPipe::set_buffer_size #1608

Conversation

Thomasdezeeuw
Collaborator

Allows the user to control the size of the buffers used.

@vadixidav, @mlsvrts could you try this branch? Specifically, can you try a write of at least 1 byte followed by a small read, because after reading the code a bit more I'm not sure this will solve your problem(s).
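For reference, the check being asked for looks roughly like the sketch below. It is written against the plain std I/O traits so it stays self-contained; in practice both ends would be mio `NamedPipe`s registered with a `Poll`, and the calls would only be attempted after readiness events. The function name and the message are illustrative only.

```rust
use std::io::{self, Read, Write};

// Write more than one byte on one end, then read with a buffer that is
// deliberately smaller than the write, to see whether the short read is
// delivered promptly instead of waiting on the internal buffer.
fn small_read_after_big_write<W: Write, R: Read>(tx: &mut W, rx: &mut R) -> io::Result<()> {
    tx.write_all(b"hello world")?; // a write of more than one byte
    let mut buf = [0u8; 4];        // read buffer smaller than the write
    let n = rx.read(&mut buf)?;
    println!("read {} bytes: {:?}", n, &buf[..n]);
    Ok(())
}
```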

```rust
/// Allows the user to control the size of the buffers used.
///
/// The default is *currently* 4kB (4096 bytes).
pub fn set_buffer_size(&mut self, capacity: usize) {
    self.inner.default_buf_size.store(capacity, Release)
}
```
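A minimal sketch of how the new setter might be called, assuming mio's `os-ext` feature on Windows; `set_buffer_size` exists only on this branch, and the pipe name and buffer size are arbitrary examples.

```rust
use mio::windows::NamedPipe;

// Sketch only: `set_buffer_size` is the method added in this PR branch and is
// not part of a released mio; the pipe name below is made up.
fn open_small_buffer_pipe() -> std::io::Result<NamedPipe> {
    let mut pipe = NamedPipe::new(r"\\.\pipe\example-pipe")?;
    // Use a much smaller buffer than the default 4096 bytes so short messages
    // are not held back waiting for a large buffer to fill.
    pipe.set_buffer_size(16);
    Ok(pipe)
}
```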
Contributor

You aren't using the atomic to synchronize memory.

Suggested change:

```diff
-self.inner.default_buf_size.store(capacity, Release)
+self.inner.default_buf_size.store(capacity, Relaxed)
```
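As background (not code from this PR): `Relaxed` suffices when the atomic only transports the value itself and is not used to publish any other writes, which is the situation the reviewer is pointing at. A standalone sketch:

```rust
use std::sync::atomic::{AtomicUsize, Ordering::Relaxed};

// Standalone illustration: the atomic only carries a number, so no other
// memory needs to become visible together with the store and no
// Release/Acquire pairing is required.
static DEFAULT_BUF_SIZE: AtomicUsize = AtomicUsize::new(4096);

fn set_buffer_size(capacity: usize) {
    DEFAULT_BUF_SIZE.store(capacity, Relaxed);
}

fn buffer_size() -> usize {
    DEFAULT_BUF_SIZE.load(Relaxed)
}
```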

@Thomasdezeeuw
Collaborator Author

@vadixidav, @mlsvrts did either of you try this? Did it fix your issues?

@vadixidav

I can give it a try right now.

@vadixidav

vadixidav commented Sep 10, 2022

I can confirm that this works @Thomasdezeeuw. I made this change to tokio: vadixidav/tokio@798a71f

This just sets every NamedPipe on Windows to have a buffer size of 1.

I patched my local application and I immediately saw latencies go from 2 seconds down to imperceptible levels.

```toml
[patch.crates-io]
mio = { git = 'https://github.com/Thomasdezeeuw/mio.git', branch = 'issue#1582_named_pipe_set_buffer_size' }
tokio = { git = 'https://github.com/vadixidav/tokio.git', branch = 'mio#1608_hack' }
```

For clarification, I am only performing reads, no writes.

@mlsvrts

mlsvrts commented Dec 7, 2022

I got around to testing this and can also confirm improved performance with this change!

@Thomasdezeeuw
Collaborator Author

Since this hasn't seen any activity, and it's likely not the API we want, I'm going to close this.

@Thomasdezeeuw Thomasdezeeuw deleted the issue#1582_named_pipe_set_buffer_size branch August 29, 2023 13:52
@vadixidav

@Thomasdezeeuw There is still a bug when using serial ports on Windows due to this issue. I would still propose that we change the default buffer size to 1; otherwise async/await operations don't work properly on Windows. Essentially, on Windows, even though data has been received, it is not handed to you until after a very long waiting period, which makes this unusable for every use case I have tried so far. Should we open a new ticket to track this bug?

@vadixidav

There is another PR over here: berkowski/mio-serial#38. Perhaps this is a better solution to the problem, but I am not sure whether it is the right place to fix the issue. Any thoughts?

@Thomasdezeeuw
Collaborator Author

> @Thomasdezeeuw There is still a bug when using serial ports on Windows due to this issue. I would still propose that we change the default buffer size to 1; otherwise async/await operations don't work properly on Windows. Essentially, on Windows, even though data has been received, it is not handed to you until after a very long waiting period, which makes this unusable for every use case I have tried so far. Should we open a new ticket to track this bug?

I don't think a buffer size of 1 makes sense for other uses of NamedPipe. But yes, let's open a ticket to keep track of this.

> There is another PR over here: berkowski/mio-serial#38. Perhaps this is a better solution to the problem, but I am not sure whether it is the right place to fix the issue. Any thoughts?

We can try something like that, but I don't have enough experience with Windows named pipes to provide any useful insights.
