First pass at HTTP req/sec benchmark #996

Merged
ry merged 3 commits into denoland:master from wrk_benchmark on Oct 16, 2018

Conversation

@ry (Member) commented Oct 15, 2018

Tests against Node.

Ref #977
cc @alexhultman

tests/tcp_wrk.ts (review comment on an outdated diff):
console.log("port", port);
http
  .Server((req, res) => {
    res.writeHead(200, { "Content-Length": "12" });

You can remove this line; it is inferred.

@ry (Member Author) replied:

fixed.
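
For reference, a minimal sketch of a handler without the explicit header (illustrative only, not necessarily the exact change made in this PR): when the whole body is handed to res.end() before any headers go out, Node fills in Content-Length itself and the status defaults to 200.

// Illustrative sketch; port 4544 is taken from the benchmark runs below.
import * as http from "http";

http
  .createServer((req, res) => {
    res.end("Hello World!"); // Node adds Content-Length: 12 automatically
  })
  .listen(4544, () => console.log("port", 4544));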

@ghost commented Oct 15, 2018

Nice to see this taking shape, but you need to keep Deno connections open.

@ry force-pushed the wrk_benchmark branch 2 times, most recently from 590f5ca to c012b7a, on October 15, 2018 at 22:54
@ry requested a review from piscisaureus on October 15, 2018 at 23:11
(Resolved review comments on .travis.yml and tests/http_bench.ts.)
@ry (Member Author) commented Oct 16, 2018

Fortunately/unfortunately I'm seeing that Deno is faster than Node now at this benchmark:

> ./tools/http_benchmark.py out/release/deno
http_benchmark testing DENO.
Listening on 127.0.0.1:4544
third_party/wrk/mac/wrk -d 10s http://127.0.0.1:4544/
Running 10s test @ http://127.0.0.1:4544/
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    32.31ms  128.08ms 861.36ms   94.27%
    Req/Sec     2.24k   438.85     2.65k    92.47%
  41665 requests in 10.01s, 2.03MB read
  Socket errors: connect 0, read 41664, write 0, timeout 0
Requests/sec:   4162.71
Transfer/sec:    207.32KB

http_benchmark testing NODE.
port 4544
third_party/wrk/mac/wrk -d 10s http://127.0.0.1:4544/
Running 10s test @ http://127.0.0.1:4544/
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.47ms  524.32us  14.80ms   92.22%
    Req/Sec     3.05k   511.42     3.61k    86.79%
  32230 requests in 10.00s, 3.84MB read
Requests/sec:   3221.81
Transfer/sec:    393.29KB

So, I will update it to do keep-alive.
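
For context, a rough sketch of what the keep-alive handling can look like at the TCP level, assuming the 2018-era `deno` module API (deno.listen("tcp", addr), conn.read returning a {nread, eof} result); the actual tests/http_bench.ts in this PR may differ in detail. The point is simply to answer every request on a connection in a loop instead of closing after the first response:

// Rough sketch only, assuming the 2018-era "deno" module API; the real
// tests/http_bench.ts may differ. Each accepted connection is served in a
// loop so wrk's keep-alive connections stay open for the whole run.
import * as deno from "deno";

const addr = "127.0.0.1:4544";
const listener = deno.listen("tcp", addr);
const response = new TextEncoder().encode(
  "HTTP/1.1 200 OK\r\nContent-Length: 12\r\n\r\nHello World!"
);

async function serve(conn: deno.Conn): Promise<void> {
  const buf = new Uint8Array(1024);
  try {
    while (true) {
      const r = await conn.read(buf); // wait for the next request
      if (r.eof) break; // client closed the connection
      await conn.write(response); // reply and keep the socket open
    }
  } finally {
    conn.close();
  }
}

async function main(): Promise<void> {
  console.log("Listening on", addr);
  while (true) {
    serve(await listener.accept()); // handle each connection concurrently
  }
}

main();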

@ry (Member Author) commented Oct 16, 2018

Using keep-alive (assuming it is doing the connection handling properly), I am also seeing Deno faster than Node:

> ./tools/http_benchmark.py out/release/deno
http_benchmark testing DENO.
Compiling /Users/rld/src/deno/tests/http_bench.ts
Listening on 127.0.0.1:4544
third_party/wrk/mac/wrk -d 10s http://127.0.0.1:4544/
Running 10s test @ http://127.0.0.1:4544/
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   699.92us    0.96ms  21.33ms   96.02%
    Req/Sec     8.84k     0.98k    9.88k    89.00%
  175851 requests in 10.00s, 12.58MB read
Requests/sec:  17583.11
Transfer/sec:      1.26MB

http_benchmark testing NODE.
port 4544
third_party/wrk/mac/wrk -d 10s http://127.0.0.1:4544/
Running 10s test @ http://127.0.0.1:4544/
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   696.01us  295.58us  12.66ms   96.08%
    Req/Sec     7.33k     1.05k    7.87k    92.57%
  147273 requests in 10.10s, 18.26MB read
Requests/sec:  14581.40
Transfer/sec:      1.81MB

(Resolved review comments on tests/http_bench.ts and tools/node_http.js.)
@ry (Member Author) commented Oct 16, 2018

On Travis, I get something like this (the previous benchmarks were from my Mac laptop):

Listening on 127.0.0.1:4544
third_party/wrk/linux/wrk -d 10s http://127.0.0.1:4544/
Running 10s test @ http://127.0.0.1:4544/
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     0.93ms    1.51ms  20.56ms   94.59%
    Req/Sec     7.85k     1.04k   15.92k    87.06%
  157049 requests in 10.10s, 11.23MB read
Requests/sec:  15550.37
Transfer/sec:      1.11MB
http_benchmark testing NODE.
port 4544
third_party/wrk/linux/wrk -d 10s http://127.0.0.1:4544/
Running 10s test @ http://127.0.0.1:4544/
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   588.21us  413.42us  11.46ms   96.68%
    Req/Sec     9.02k     1.33k   18.88k    93.53%
  180468 requests in 10.10s, 22.37MB read
Requests/sec:  17868.22
Transfer/sec:      2.22MB

seems good. will land now.

@ghost commented Oct 16, 2018

Nice to see this coming together; however, some reminders:

Node.js scores 2x if you replace the HTTP module with a similarly simple net placeholder implementation:

const net = require('net');

let k = Buffer.from("HTTP/1.1 200 OK\r\nContent-Length: 12\r\n\r\nHello World!");

net.createServer(function (socket) {
  socket.on('data', function (data) {
    socket.write(k);
  });
}).listen(3000);

For me this goes from 20k to 40k.

Also, remember that Deno is not strictly single-threaded in CPU-time utilization but more of a 1.6x consumer. Normalizing the numbers with this in mind, together with the much simpler Node.js net implementation, Deno is not any more efficient than Node.js; in fact they differ quite a bit.
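
(A rough, purely illustrative reading of that normalization, using the Mac keep-alive numbers above and the 1.6x figure as stated: 17583 / 1.6 ≈ 10990 req/sec of throughput per Node-equivalent core, against 14581 req/sec for Node's http module, and roughly double that for the net version.)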

@ry merged commit c61a0f2 into denoland:master on Oct 16, 2018
@ry (Member Author) commented Oct 16, 2018

@alexhultman Yep - Deno is clearly slower. We will start picking from the optimization tree now.

@ghost commented Oct 16, 2018

👍
