How to set concurrent request processing #161
In version 0.1, Rocket sets the number of threads to the number of cores on the machine, and this cannot be changed. Presumably your machine has a single core. In the soon-to-be-released v0.2, Rocket sets the default number of threads to be …
Thank you for the answer, …
I tried this out on my machine, and Rocket is properly serving requests from multiple workers. (Neat app, by the way!) Perhaps you can try to see how many workers Rocket is using by temporarily switching to the master branch? Just change your Cargo.toml:

```toml
[package]
name = "rust-birkana-http"
version = "0.1.0"
authors = ["Nemec Lukas <lukas.nemec2@firma.seznam.cz>"]

[dependencies]
rocket = { git = "https://github.com/SergioBenitez/Rocket" }
rocket_codegen = { git = "https://github.com/SergioBenitez/Rocket" }
rust-birkana = "1.1.1"
serde = "0.8"
serde_derive = "0.8"

[dependencies.rocket_contrib]
git = "https://github.com/SergioBenitez/Rocket"
default-features = false
features = ["tera_templates"]
```

Rocket will log how many workers it's using at launch:

```
🔧 Configured for development.
    => address: localhost
    => port: 8080
    => log: normal
    => workers: 12
    => [extra] template_dir: "templates/"
```
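For reference, a launch log like the one above would correspond to a Rocket.toml along these lines. This is a hedged reconstruction, not a file from the thread; the key names follow Rocket's configuration conventions, with `template_dir` as an extra:

```toml
# Hypothetical Rocket.toml matching the launch log above
[development]
address = "localhost"
port = 8080
log = "normal"
workers = 12
template_dir = "templates/"
```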
Some quick benchmarking on your code: …
Hmm, this is really strange behavior; I'll try to get wrk results tomorrow. I tried the release version, no difference. I'll also try on another system (a Linux server), since there may be some macOS FS shenanigans…
I tried your code on OS X 10.11, but I'm not seeing the behavior you describe. What happens if you try bumping the number of workers to something large, like 50? Just add a …
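The truncated suggestion above presumably refers to Rocket's `workers` setting in Rocket.toml; a sketch of what to add, assuming that key:

```toml
# Rocket.toml — raise the worker count (assumes Rocket's `workers` config key)
[development]
workers = 50
```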
Well, 50 workers does work. With 50 I can't reproduce the slowness. However, when I change back to 5 workers, I can still reproduce it. Below are wrk results (both with 5 workers):

debug: …
release: …
I tried both Firefox and Chrome, and they both show the same behavior. I'll try to record it with a screen recorder…
Can you hop on IRC or Matrix to talk about this a bit more? I'm fairly certain I know what's going on, but I'd like to have a bit of back-and-forth to confirm. There are links in the README on how to join the chat.
Sure! Thank you for the help. Here is a video of the "slowness": https://drive.google.com/drive/folders/0Bwv_8TwNErXzRmRELThUTU1LOTA?usp=sharing You can see the browser taking over 4 s to start loading data from the server.
We discovered it is Hyper's fault.
OK, so I want to say that I had the same error/behavior while trying to load multiple JavaScript files.

Rocket.toml: …
CPU: Intel i5-6600K (4) @ 4.400GHz
Code: …
Some images to reflect the behavior: …
As you can see, Rocket seems to be bottlenecked somewhere (5 s to load a JS file is huge). Increasing the number of workers to 50, as suggested here, resolved my issue. Maybe this issue has already been fixed with another workaround, but I didn't see it. Thanks for reading.
Increasing the number of workers also worked for me. On a multi-core machine, I didn't have any problem. On a single-core machine, the TTFB would be 5+ seconds whenever the browser made 3+ requests in quick succession. Manually setting the workers to 8 immediately cleared up the issue.
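The queueing effect described in the last few comments can be reproduced outside Rocket. The following is a minimal, std-only Rust sketch (hypothetical illustration, not Rocket's actual code): a fixed pool of threads drains a queue of simulated requests, so when there are fewer workers than concurrent requests, the excess requests wait roughly one service time per "wave":

```rust
use std::sync::{mpsc, Arc, Mutex};
use std::thread;
use std::time::{Duration, Instant};

/// Handle `jobs` simulated requests, each taking `service` to process,
/// on a fixed pool of `workers` threads. Returns the total wall time.
fn run_pool(workers: usize, jobs: usize, service: Duration) -> Duration {
    let (tx, rx) = mpsc::channel::<()>();
    // Share one receiver among all workers; the Mutex serializes dequeueing.
    let rx = Arc::new(Mutex::new(rx));
    let start = Instant::now();
    let handles: Vec<_> = (0..workers)
        .map(|_| {
            let rx = Arc::clone(&rx);
            thread::spawn(move || loop {
                // Take the next queued "request"; the lock guard is dropped
                // at the end of this statement, so handling runs unlocked.
                let job = rx.lock().unwrap().recv();
                match job {
                    Ok(()) => thread::sleep(service), // simulate request handling
                    Err(_) => break, // queue closed and drained: worker exits
                }
            })
        })
        .collect();
    for _ in 0..jobs {
        tx.send(()).unwrap(); // 4 "browser requests" arrive at once
    }
    drop(tx); // close the queue so idle workers stop
    for h in handles {
        h.join().unwrap();
    }
    start.elapsed()
}

fn main() {
    let service = Duration::from_millis(50);
    let one = run_pool(1, 4, service); // requests are serialized: ~4 × 50 ms
    let four = run_pool(4, 4, service); // requests overlap: ~50 ms
    println!("1 worker: {:?}, 4 workers: {:?}", one, four);
    assert!(one > four);
}
```

This is the same shape as the reports above: a single-core (single-worker) setup serializes the browser's parallel asset requests, while raising the worker count lets them overlap.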
(Original issue body) I created a simple HTTP service:
https://github.com/lunemec/rust-birkana-http
Just to try HTTP frameworks in Rust. It performs OK, but it can't handle concurrent requests. This can be seen when serving static files: if you open the dev/release version in two browsers and load them at the same time, one waits for the other to finish loading resources.
I saw a mention in #21 that there is some `workers` setting? I couldn't find any mention of it in the API docs or by googling. Does Rocket have this option, or is it up to me to create a ThreadPool and somehow tie it into HTTP serving? Thank you.