
Low Performance on Parallel request #1335

Closed
feters opened this issue Mar 30, 2022 · 4 comments
Comments

@feters

feters commented Mar 30, 2022

Hello, I am Feter, a Master's student at Lulea University.
I ran a local test in a Docker container using the Neqo server against a Pico client, and the throughput is surprisingly low compared to other QUIC server implementations, which surprised me since, as far as I know, Rust is a high-performance language. I used an Ubuntu 20.04 image for both the server and the client. Here are the results and how I ran the test:

Run Neqo Server
ionice -c 2 -n 0 nice -n -20 env RUST_LOG=trace cargo run --bin neqo-server -- [::]:7733 --db ./test-fixture/db --qlog-dir server_qlog

Test Case
Transfer file of 1GB in parallel 10 times

| Parallel Requests | 10 | 50 | 100 |
| --- | --- | --- | --- |
| Throughput (Mbps) | 179.468 | 23.564 | 15.011 |
| RTT (seconds) | 1.79 | 8.043 | 13.074 |
| Transfer Time (seconds) | 7.622 | 50.988 | 75.424 |

Are there any extra things I can do to increase the performance?
Thank you.

@martinthomson
Member

Rust might have low overheads, but it's possible that our code does not.

One major factor here is likely that our server is single-threaded. It exists primarily so that we are able to test our client; no significant effort has been put into making the server performant.

@junhochoi

junhochoi commented Apr 2, 2022

@feters I think you can try again without `RUST_LOG=trace` and the `--qlog-dir` option, for less verbosity, add `--release` to `cargo run`, and see the difference.
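Applying that suggestion to the command from the original post, the invocation might look like the sketch below. The port, certificate DB path, and the `ionice`/`nice` wrapping are carried over from the issue as-is and may need adjusting for your setup:

```shell
# Build with optimizations (--release) and run without trace logging
# or qlog output, both of which add significant per-packet overhead.
# Port and DB path are taken from the original command above.
ionice -c 2 -n 0 nice -n -20 \
  cargo run --release --bin neqo-server -- [::]:7733 --db ./test-fixture/db
```

A debug (`cargo run` without `--release`) build of Rust code can easily be an order of magnitude slower than a release build, so this alone may change the numbers substantially.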

@feters
Author

feters commented Apr 2, 2022

@martinthomson Thank you.
@junhochoi Okay, I will try that. Thank you very much.

@larseggert
Collaborator

Closing. @feters please reopen if this is still an issue.


4 participants