
ServerBenchmarkRoundH210


Benchmarking HTTP/2 (h2-10) and SPDY Servers

We performed benchmark tests against HTTP/2 and SPDY servers.

We used 2 HTTP/2 servers: node-http2 and nghttpd. Both servers implement the h2-10 draft version. The node-http2 server is basically server.js from the examples directory of the node-http2 repository, except that we changed the cached file URL to /index.html, which is the file we request in the benchmark.

To compare across protocols and languages, we also benchmarked nginx 1.5.11 and node-spdy. Both servers implement SPDY version 3.1. We set up nginx.conf to enable SPDY on the SSL/TLS connection and to use the maximum header compression level. The node-spdy server script is based on the node-http2 server.js and caches index.html just like the node-http2 server mentioned above.

We know that node is not good at serving static content, and the results should be viewed with that in mind. The benchmark results do not prove anything for or against each implementation, because the numbers may change drastically with different server settings.

Each server was configured to run on a single core (for nginx this means worker_processes 1, as in the nginx.conf below; the node servers run as a single process by default).

h2load was used as the benchmark tool, with the following command line options:

$ h2load -n100000 -c100 -m100 https://localhost:8443/index.html
-n100000
100,000 requests in total
-c100
100 concurrent client connections
-m100
up to 100 concurrent streams per connection

The index.html is the 612-byte default page that ships with nginx.

Node.js version is 0.10.26.

The OS is Ubuntu 13.10 with everything updated.

The hardware we used is an Intel(R) Core(TM) i7-4600U CPU @ 2.10GHz (2 cores, 4 threads) with 8GB of memory.

Results

We ran the benchmark 5 times for each server and took the median, except for node-http2, which we only ran once.

Server        Time (sec)   Requests/sec
nghttpd            1.309          76338
nginx              2.17           49566
node-spdy         51.695           1934
node-http2        80.43            1249

nghttpd is the fastest, but be aware that nginx does a lot more work and is a fully featured web server widely used in production, and the nginx web site says that its SPDY implementation is still experimental.

For the node servers, node-spdy is a lot faster than node-http2 despite the similar setups. The node-spdy server consumed 300MB of RAM (RES) during benchmarking. On the other hand, the node-http2 server consumed 1.3GB and was getting slower in the second run, so we canceled that test.

nginx.conf

#user  nobody;
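# single worker process: nginx uses one core, as noted above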
worker_processes  1;

#error_log  logs/error.log;
#error_log  logs/error.log  notice;
#error_log  logs/error.log  info;

#pid        logs/nginx.pid;


events {
    worker_connections  16384;
}


http {
    include       mime.types;
    default_type  application/octet-stream;

    #log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '
    #                  '$status $body_bytes_sent "$http_referer" '
    #                  '"$http_user_agent" "$http_x_forwarded_for"';

    #access_log  logs/access.log  main;

    sendfile        on;
    #tcp_nopush     on;

    #keepalive_timeout  0;
    keepalive_timeout  65;

    #gzip  on;

    server {
        listen       10080;
        server_name  localhost;

        #charset koi8-r;

        #access_log  logs/host.access.log  main;

        location / {
            root   html;
            index  index.html index.htm;
        }

        #error_page  404              /404.html;

        # redirect server error pages to the static page /50x.html
        #
        error_page   500 502 503 504  /50x.html;
        location = /50x.html {
            root   html;
        }

        # proxy the PHP scripts to Apache listening on 127.0.0.1:80
        #
        #location ~ \.php$ {
        #    proxy_pass   http://127.0.0.1;
        #}

        # pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
        #
        #location ~ \.php$ {
        #    root           html;
        #    fastcgi_pass   127.0.0.1:9000;
        #    fastcgi_index  index.php;
        #    fastcgi_param  SCRIPT_FILENAME  /scripts$fastcgi_script_name;
        #    include        fastcgi_params;
        #}

        # deny access to .htaccess files, if Apache's document root
        # concurs with nginx's one
        #
        #location ~ /\.ht {
        #    deny  all;
        #}
    }


    # another virtual host using mix of IP-, name-, and port-based configuration
    #
    #server {
    #    listen       8000;
    #    listen       somename:8080;
    #    server_name  somename  alias  another.alias;

    #    location / {
    #        root   html;
    #        index  index.html index.htm;
    #    }
    #}


    # HTTPS server

    server {
       listen       18443 ssl spdy;
       server_name  localhost;

       spdy_headers_comp 9;
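       # 9 is the maximum SPDY header compression level, as mentioned above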

       ssl_certificate      server.crt;
       ssl_certificate_key  server.key;

       ssl_session_cache    shared:SSL:1m;
       ssl_session_timeout  5m;

       ssl_ciphers  HIGH:!aNULL:!MD5;
       ssl_prefer_server_ciphers  on;

       location / {
           root   html;
           index  index.html index.htm;
       }
    }

}
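
Note that the SPDY server block above listens on port 18443 rather than the 8443 shown in the generic h2load command line, so the run against nginx presumably used the same options with that port, along the lines of:

$ h2load -n100000 -c100 -m100 https://localhost:18443/index.html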

node-http2 server.js

var fs = require('fs');
var path = require('path');
var http2 = require('..');

var options = process.env.HTTP2_PLAIN ? {
  plain: true
} : {
  key: fs.readFileSync(path.join(__dirname, '/localhost.key')),
  cert: fs.readFileSync(path.join(__dirname, '/localhost.crt'))
};

// Passing bunyan logger (optional)
options.log = require('../test/util').createLogger('server');

// We cache one file to be able to do simple performance tests without waiting for the disk
var cachedFile = fs.readFileSync(path.join(__dirname, './index.html'));
var cachedUrl = '/index.html';

// Creating the server
var server = http2.createServer(options, function(request, response) {
  var filename = path.join(__dirname, request.url);

  // Serving the cached index.html. Useful for microbenchmarks.
  if (request.url === cachedUrl) {
    response.end(cachedFile);
  }

  // Reading file from disk if it exists and is safe.
  else if ((filename.indexOf(__dirname) === 0) && fs.existsSync(filename) && fs.statSync(filename).isFile()) {
    response.writeHead('200');

    // If they download the certificate, push the private key too, they might need it.
    if (response.push && request.url === '/localhost.crt') {
      var push = response.push('/localhost.key');
      push.writeHead(200);
      fs.createReadStream(path.join(__dirname, '/localhost.key')).pipe(push);
    }

    fs.createReadStream(filename).pipe(response);
  }

  // Otherwise responding with 404.
  else {
    response.writeHead('404');
    response.end();
  }
});

server.listen(process.env.HTTP2_PORT || 8080);
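
The server listens on port 8080 by default, or on HTTP2_PORT if that environment variable is set. Since the h2load command line above targets port 8443, the benchmark presumably started it with something like:

$ HTTP2_PORT=8443 node server.js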

node-spdy spdy.js

var spdy = require('spdy');
var fs = require('fs');
var path = require('path');


var options = {
  key: fs.readFileSync(path.join(__dirname, '/localhost.key')),
  cert: fs.readFileSync(path.join(__dirname, '/localhost.crt'))
};

// Passing bunyan logger (optional)
options.log = require('../test/util').createLogger('server');

// We cache one file to be able to do simple performance tests without waiting for the disk
var cachedFile = fs.readFileSync(path.join(__dirname, './index.html'));
var cachedUrl = '/index.html';

// Creating the server
var server = spdy.createServer(options, function(request, response) {
  var filename = path.join(__dirname, request.url);

  // Serving the cached index.html. Useful for microbenchmarks.
  if (request.url === cachedUrl) {
    response.end(cachedFile);
  }

  // Reading file from disk if it exists and is safe.
  else if ((filename.indexOf(__dirname) === 0) && fs.existsSync(filename) && fs.statSync(filename).isFile()) {
    response.writeHead('200');

    fs.createReadStream(filename).pipe(response);
  }

  // Otherwise responding with 404.
  else {
    response.writeHead('404');
    response.end();
  }
});

server.listen(process.env.HTTP2_PORT || 8080);
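
spdy.js uses the same HTTP2_PORT/8080 listen logic and expects localhost.key, localhost.crt, and index.html alongside the script, so it can presumably be started the same way:

$ HTTP2_PORT=8443 node spdy.js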