Performance Issues with Envoy #5536

Open · voley55 opened this issue Jan 8, 2019 · 2 comments

voley55 commented Jan 8, 2019

Benchmarking Envoy and comparing performance against HAProxy

Setup:
LB : Envoy/HAProxy
Backend : Nginx
Benchmarking Tool : wrk (https://github.com/wg/wrk)

Envoy Config:
Concurrency = 4

static_resources:
  listeners:
  - name: test
    address:
      socket_address:
        protocol: TCP
        address: 0.0.0.0
        port_value: 8090
    filter_chains:
    - filters:
      - name: envoy.http_connection_manager
        config:
          stat_prefix: ingress_http
          generate_request_id: false
          route_config:
            name: test_routes
            virtual_hosts:
            - name: test_service
              domains:
              - "*"
              routes:
              - match:
                  prefix: "/"
                route:
                  cluster: test_backend
          http_filters:
          - name: envoy.router
            config:
              dynamic_stats: false
  clusters:
  - name: test_backend
    connect_timeout: 0.25s
    hosts:
    - socket_address:
        address: 172.16.x.x
        port_value: 8000
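
The "Concurrency = 4" setting above refers to Envoy's `--concurrency` command-line flag, which controls the number of worker threads. The actual launch command is not shown in the issue, but it presumably looked something like this (config path is hypothetical):

```shell
# Hypothetical invocation; the actual command was not given in this issue.
# --concurrency 4 sets four worker threads, matching HAProxy's nbproc 4 below.
envoy --config-path /etc/envoy/envoy.yaml --concurrency 4
```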

HAProxy Config:

global
  daemon
  maxconn 10000
  nbproc 4
defaults
  mode http
  timeout connect 5000ms
  timeout client 50000ms
  timeout server 50000ms
  http-reuse aggressive
frontend test
  bind *:8080
  acl test1 path_beg /
  use_backend test_backend if test1
backend test_backend
  server server1 172.16.x.x:8000

Nginx Config:

worker_processes 4;

events {
    worker_connections 1000;
}

http {
    server {
        listen 8000 default_server;

        server_name test.com;
        access_log /var/log/nginx/test.access.log;
        error_log  /var/log/nginx/test.error.log;

        location / {
            return 200 'Woohoo!';
        }
    }
}

Benchmark Results

Envoy
$ wrk -c100 -d60s -t10 "http://172.16.x.x:8090/" --latency

Running 1m test @ http://172.16.x.x:8090/
  10 threads and 100 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     2.73ms    5.94ms 413.43ms   99.73%
    Req/Sec     4.00k     1.43k    9.36k    63.22%
  Latency Distribution
     50%    2.16ms
     75%    3.56ms
     90%    4.50ms
     99%    7.10ms
  2388909 requests in 1.00m, 389.58MB read
Requests/sec:  39748.81
Transfer/sec:      6.48MB

HAProxy
$ wrk -c100 -d60s -t10 "http://172.16.x.x:8080/" --latency

Running 1m test @ http://172.16.x.x:8080/
  10 threads and 100 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.14ms  526.36us  31.04ms   89.20%
    Req/Sec     8.89k     1.79k   14.23k    63.36%
  Latency Distribution
     50%    1.05ms
     75%    1.32ms
     90%    1.63ms
     99%    2.20ms
  5315577 requests in 1.00m, 729.98MB read
Requests/sec:  88446.34
Transfer/sec:     12.15MB

Note:

  • Both load balancers are running inside Docker in host networking mode.
  • During the Envoy benchmark, the backend (nginx) CPU utilisation only reaches 60%.
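
For a baseline, the backend can also be benchmarked directly on port 8000, bypassing the proxy entirely; this shows how much headroom nginx itself has and isolates the proxy's overhead:

```shell
# Same wrk parameters as the proxy runs, but pointed straight at nginx.
wrk -c100 -d60s -t10 "http://172.16.x.x:8000/" --latency
```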

Could you please point out where I am going wrong? (According to various online blogs, Envoy should provide performance roughly equivalent to HAProxy.)

brian-pane (Contributor) commented Jan 13, 2019

I just ran a quick test and got similar results. I ran a CPU profiler on Envoy during the test, and it highlighted part of the reason for the performance difference: wrk uses HTTP/1.1, and Envoy uses a relatively CPU-intensive HTTP/1 parser. HTTP/1 parser performance is a longstanding issue; #2880 has some context.
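
The exact profiling invocation isn't given in this thread, but for anyone wanting to reproduce a CPU profile of a running Envoy, a rough sketch using Linux `perf` (assuming a perf-enabled kernel and sufficient privileges) might look like:

```shell
# Sample the running envoy process with call graphs at 99 Hz for 30 seconds.
perf record -F 99 -g -p "$(pgrep -x envoy)" -- sleep 30

# Summarize the hottest call paths; the HTTP/1 parser should show up here.
perf report --stdio | head -40
```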

bedis commented Feb 6, 2019

Which HAProxy version?
The HAProxy config is not even optimal :)
