
Wrk stress testing #8

Closed
proyb6 opened this issue Nov 27, 2016 · 1 comment

proyb6 commented Nov 27, 2016

The server needs improvement to handle load. Would it be a good idea to participate in the Swift Server APIs effort, and to consider using their HTTP implementation together with the Node.js HTTP parser (written in C), for better performance than Node.js itself, if the team decides to go that route?

Error output during wrk testing:


wrk -d10 -c100 -t2 http://localhost:1337/
Running 10s test @ http://localhost:1337/
  2 threads and 100 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    68.09ms   22.21ms 132.56ms   69.66%
    Req/Sec   370.48     68.96   560.00     65.00%
  7379 requests in 10.01s, 1.50MB read
  Socket errors: connect 0, read 795, write 1, timeout 0
Requests/sec:    737.34
Transfer/sec:    152.98KB

helje5 (Member) commented Nov 27, 2016

That 'issue' is less a 'proper issue' and more like a question, but let me address it nevertheless.

First of all: We meant it when we wrote:

Not for production. Consider the version numbers.

(Also: make sure to configure the listen-queue to your needs if you do a test like that ...)
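
For context, the 795 read errors in the wrk output above are often a sign that the accept backlog is overflowing under 100 concurrent connections. Below is a minimal sketch of where that "listen queue" actually lives, using plain POSIX sockets rather than the Noze.io API; the helper name and the backlog value of 512 are purely illustrative.

```swift
#if os(Linux)
import Glibc
#else
import Darwin
#endif

// Illustrative only: a plain BSD-socket listener with an explicit backlog.
// The second argument to listen(2) is the "listen queue": how many
// not-yet-accepted connections the kernel will hold before dropping new
// ones, which then surface as connect/read errors in tools like wrk.
func makeListeningSocket(port: UInt16, backlog: Int32 = 512) -> Int32 {
  #if os(Linux)
  let fd = socket(AF_INET, Int32(SOCK_STREAM.rawValue), 0)
  #else
  let fd = socket(AF_INET, SOCK_STREAM, 0)
  #endif
  precondition(fd >= 0, "socket() failed")

  var yes: Int32 = 1
  setsockopt(fd, SOL_SOCKET, SO_REUSEADDR, &yes,
             socklen_t(MemoryLayout<Int32>.size))

  var addr = sockaddr_in()
  addr.sin_family = sa_family_t(AF_INET)
  addr.sin_port   = port.bigEndian          // network byte order
  addr.sin_addr   = in_addr(s_addr: 0)      // INADDR_ANY

  let rc = withUnsafePointer(to: &addr) {
    $0.withMemoryRebound(to: sockaddr.self, capacity: 1) {
      bind(fd, $0, socklen_t(MemoryLayout<sockaddr_in>.size))
    }
  }
  precondition(rc == 0, "bind() failed")

  listen(fd, backlog)                       // <- the listen queue size
  return fd
}
```

Note that on Linux the effective backlog is additionally capped by the net.core.somaxconn sysctl (kern.ipc.somaxconn on macOS), so the kernel limit may need raising as well for this kind of test.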

There are a few reasons why the performance of Noze.io is not particularly high at the moment. The focus is still on producing a sound, generic, streaming API. Performance can follow.

Yes, we should use the upstream C http_parser; I've created issue #9 for this. The primary (essentially sole) reason we don't yet is to keep the setup of the dev environment simple. See the issue for more details.
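
For the curious, driving the C http_parser from Swift looks roughly like the sketch below. The CHTTPParser module name is made up for illustration (a system-library target wrapping http_parser.h); see #9 for how this would actually be integrated.

```swift
import CHTTPParser   // hypothetical system-library target exposing http_parser.h

// http_parser is callback-based and incremental: you feed it raw bytes as
// they arrive from the socket and it invokes C function pointers for the
// URL, headers, body chunks etc. The callbacks cannot capture Swift
// context; real code passes state through parser.data instead.
var settings = http_parser_settings()       // imported C struct, zero-filled

settings.on_url = { parser, buf, len in
  // buf/len point into the buffer given to http_parser_execute (zero copy)
  return 0
}
settings.on_message_complete = { parser in
  // request head (and body, if any) fully parsed
  return 0
}

var parser = http_parser()
http_parser_init(&parser, HTTP_REQUEST)

// Feed raw bytes; partial reads are fine, the parser keeps its state.
let request = "GET / HTTP/1.1\r\nHost: localhost\r\n\r\n"
let parsed  = request.withCString {
  http_parser_execute(&parser, &settings, $0, request.utf8.count)
}
assert(parsed == request.utf8.count, "http_parser reported a parse error")
```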

Another major reason it is slower than necessary is issue #5. This is rooted in limitations of the Swift 3 generics implementation. Presumably this is going to get fixed in Swift 4, but we should still add a workaround as described in #5. Again: speed is not the most important goal for now, but a sound API.

Yet another reason is that Swift can't optimise across modules (generics are a particular issue here) and Noze.io is pretty heavily modularised (not as much as Node.js but much more than most other 'fat' Swift modules).

Then there is still the question of whether GCD scales well enough for server loads. It may need to be dropped and replaced with something else (e.g. libuv).
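
To make that concrete, the GCD building block in question is essentially a DispatchSource watching the listening socket. A minimal sketch (not actual Noze.io code), reusing the makeListeningSocket helper from the sketch above:

```swift
import Dispatch
#if os(Linux)
import Glibc
#else
import Darwin
#endif

// Sketch: accept connections via a GCD read source on the listening
// socket. The open question is whether this model keeps up with
// dedicated event loops such as libuv's under real server loads.
let listenFD    = makeListeningSocket(port: 1337)
let acceptQueue = DispatchQueue(label: "accept-queue")

let acceptSource = DispatchSource.makeReadSource(fileDescriptor: listenFD,
                                                 queue: acceptQueue)
acceptSource.setEventHandler {
  // Fires whenever at least one connection is pending, so this accept
  // will not block even though the socket itself is blocking.
  let clientFD = accept(listenFD, nil, nil)
  guard clientFD >= 0 else { return }
  // Hand clientFD to a per-connection read source / stream here.
}
acceptSource.resume()

dispatchMain()   // park the main thread; everything runs on GCD queues
```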

etc.

Summary: Our goal at this stage is not to accomplish the best possible speed. An important thing to understand is that this is a generic streaming library and not a framework focused on just HTTP (like many others). In the long run it should result in superior scalability due to the way Noze.io applications are architected, but we'll see :-)

To address your remaining points:

would it be a good idea to participate in the Swift Server APIs effort

We do watch the effort and are quite interested to see what comes out of it. I think if the effort does its job well it'll likely make Noze.io kinda superfluous.

Remember that Noze.io is less about its express module and much more about the streaming I/O.

consider using their HTTP implementation

IMHO this likely is going to buy us very little, but we'll see. There is nothing wrong with just using the C http_parser today.

I hope this gives you a little insight into where we are going. I'm going to close this issue for now; specific performance problems should be tracked in dedicated issues like #9 or #5, and this one is a little too generic.

helje5 closed this as completed Nov 27, 2016