benchmark: v0.4, made a benchmark for apisix to determine if there is a performance issue #36
1 route + 1 upstream + 2 plugins (limit-count + prometheus)

# route information
$ curl http://127.0.0.1:2379/v2/keys/apisix/routes/1 -X PUT \
-d value='{"methods":["GET"],"uri":"/hello","id":1,"plugin_config":{"limit-count":{"count":200000000,"time_window":60,"rejected_code":503,"key":"remote_addr"},"prometheus":{}},"upstream":{"type":"roundrobin","nodes":{"127.0.0.1:80":1,"127.0.0.2:80":1}}}'

$ wrk -d 5 --latency http://127.0.0.1:9080/hello
Running 5s test @ http://127.0.0.1:9080/hello
2 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 732.47us 159.15us 2.62ms 89.17%
Req/Sec 6.78k 460.18 7.50k 67.65%
Latency Distribution
50% 685.00us
75% 757.00us
90% 0.90ms
99% 1.39ms
68815 requests in 5.10s, 16.21MB read
Requests/sec: 13492.64
Transfer/sec: 3.18MB
$ wrk -d 5 --latency http://127.0.0.1:9080/hello
Running 5s test @ http://127.0.0.1:9080/hello
2 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 770.87us 141.37us 3.86ms 85.88%
Req/Sec 6.44k 335.48 7.17k 70.59%
Latency Distribution
50% 734.00us
75% 812.00us
90% 0.92ms
99% 1.26ms
65363 requests in 5.10s, 15.39MB read
Requests/sec: 12815.03
Transfer/sec: 3.02MB

better version with this commit:

$ curl http://127.0.0.1:2379/v2/keys/apisix/routes/1 -X PUT \
-d value='{"methods":["GET"],"uri":"/hello","id":1,"plugin_config":{"limit-count":{"count":200000000,"time_window":60,"rejected_code":503,"key":"remote_addr"},"prometheus":{}},"upstream":{"type":"roundrobin","nodes":{"127.0.0.1:80":1,"127.0.0.2:80":1}}}'
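As a side note on the route payload used in every run above: the limit-count `count` is deliberately set to 200000000, far more than any 5-second wrk run can issue, so no request is ever rejected and the benchmark measures the plugin's per-request overhead rather than rate limiting itself. A minimal sketch that parses the payload from the curl command and checks these assumptions:

```python
import json

# Route document copied from the curl command above (whitespace added for
# readability; json.loads ignores it).
route = json.loads('''{"methods":["GET"],"uri":"/hello","id":1,
  "plugin_config":{"limit-count":{"count":200000000,"time_window":60,
  "rejected_code":503,"key":"remote_addr"},"prometheus":{}},
  "upstream":{"type":"roundrobin",
  "nodes":{"127.0.0.1:80":1,"127.0.0.2:80":1}}}''')

# Exactly the two plugins from the section heading are enabled.
assert set(route["plugin_config"]) == {"limit-count", "prometheus"}
# The quota is far above anything a short wrk run can reach.
assert route["plugin_config"]["limit-count"]["count"] == 200_000_000
# Two upstream nodes balanced round-robin with equal weight.
assert all(w == 1 for w in route["upstream"]["nodes"].values())
print("route config parsed ok")
```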
$ wrk -d 500 --latency http://127.0.0.1:9080/hello
Running 8m test @ http://127.0.0.1:9080/hello
2 threads and 10 connections
^C Thread Stats Avg Stdev Max +/- Stdev
Latency 680.99us 126.71us 1.91ms 83.71%
Req/Sec 7.30k 438.79 8.21k 72.73%
Latency Distribution
50% 654.00us
75% 718.00us
90% 823.00us
99% 1.20ms
31941 requests in 2.20s, 7.52MB read
Requests/sec: 14519.97
Transfer/sec: 3.42MB
$ wrk -d 5 --latency http://127.0.0.1:9080/hello
Running 5s test @ http://127.0.0.1:9080/hello
2 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 663.87us 148.15us 2.51ms 87.31%
Req/Sec 7.49k 0.92k 14.09k 87.13%
Latency Distribution
50% 634.00us
75% 697.00us
90% 823.00us
99% 1.25ms
75269 requests in 5.10s, 17.73MB read
Requests/sec: 14758.49
Transfer/sec: 3.48MB

# no plugin enabled
$ curl http://127.0.0.1:2379/v2/keys/apisix/routes/1 -X PUT \
-d value='{"methods":["GET"],"uri":"/hello","id":1,"plugin_config":{},"upstream":{"type":"roundrobin","nodes":{"127.0.0.1:80":1,"127.0.0.2:80":1}}}'
$ wrk -d 5 --latency http://127.0.0.1:9080/hello
Running 5s test @ http://127.0.0.1:9080/hello
2 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 600.46us 163.78us 3.90ms 91.57%
Req/Sec 8.28k 726.78 9.39k 72.00%
Latency Distribution
50% 556.00us
75% 613.00us
90% 735.00us
99% 1.29ms
82384 requests in 5.00s, 14.37MB read
Requests/sec: 16474.20
Transfer/sec: 2.87MB
$ wrk -d 5 --latency http://127.0.0.1:9080/hello
Running 5s test @ http://127.0.0.1:9080/hello
2 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 623.03us 201.04us 4.50ms 87.49%
Req/Sec 7.99k 1.03k 15.15k 79.21%
Latency Distribution
50% 557.00us
75% 655.00us
90% 842.00us
99% 1.40ms
80322 requests in 5.10s, 14.01MB read
Requests/sec: 15752.11
Transfer/sec: 2.75MB
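Comparing the plugin-enabled runs before and after the commit gives a rough sense of the gain. A small sketch using only the Requests/sec figures reported above (averaging the two runs of each configuration):

```python
# Requests/sec figures copied from the wrk output above.
v04_rps = [13492.64, 12815.03]       # v0.4, limit-count + prometheus
patched_rps = [14519.97, 14758.49]   # with this commit, same plugins

def mean(xs):
    return sum(xs) / len(xs)

# Relative throughput gain of the patched build over v0.4.
improvement = mean(patched_rps) / mean(v04_rps) - 1
print(f"throughput gain: {improvement:.1%}")  # roughly 11.3%
```

So on this machine the commit is worth roughly an 11% throughput improvement with both plugins enabled.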
commit: fe02056 + patch: https://github.com/iresty/lua-var-nginx-module

# 1 upstream + 1 route + 2 plugins (limit-count + prometheus)
$ curl http://127.0.0.1:2379/v2/keys/apisix/routes/1 -X PUT \
-d value='{"methods":["GET"],"uri":"/hello","id":1,"plugin_config":{"limit-count":{"count":200000000,"time_window":60,"rejected_code":503,"key":"remote_addr"},"prometheus":{}},"upstream":{"type":"roundrobin","nodes":{"127.0.0.1:80":1,"127.0.0.2:80":1}}}'
$ wrk -d 5 --latency http://127.0.0.1:9080/hello
Running 5s test @ http://127.0.0.1:9080/hello
2 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 641.54us 158.44us 3.64ms 93.50%
Req/Sec 7.78k 515.63 8.71k 58.82%
Latency Distribution
50% 603.00us
75% 672.00us
90% 756.00us
99% 1.24ms
78919 requests in 5.10s, 18.59MB read
Requests/sec: 15475.07
Transfer/sec: 3.64MB
$ wrk -d 5 --latency http://127.0.0.1:9080/hello
Running 5s test @ http://127.0.0.1:9080/hello
2 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 693.14us 142.28us 3.66ms 89.51%
Req/Sec 7.20k 396.29 8.10k 75.00%
Latency Distribution
50% 660.00us
75% 731.00us
90% 816.00us
99% 1.19ms
71566 requests in 5.00s, 16.85MB read
Requests/sec: 14312.39
Transfer/sec: 3.37MB

# 1 route + 1 upstream + no plugin
$ curl http://127.0.0.1:2379/v2/keys/apisix/routes/1 -X PUT -d value='{"methods":["GET"],"uri":"/hello","id":1,"plugin_config":{},"upstream":{"type":"roundrobin","nodes":{"127.0.0.1:80":1,"127.0.0.2:80":1}}}'
$ wrk -d 5 --latency http://127.0.0.1:9080/hello
Running 5s test @ http://127.0.0.1:9080/hello
2 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 567.92us 113.97us 3.95ms 92.04%
Req/Sec 8.69k 476.91 9.31k 68.63%
Latency Distribution
50% 542.00us
75% 584.00us
90% 656.00us
99% 0.99ms
88200 requests in 5.10s, 15.39MB read
Requests/sec: 17293.04
Transfer/sec: 3.02MB
$ wrk -d 5 --latency http://127.0.0.1:9080/hello
Running 5s test @ http://127.0.0.1:9080/hello
2 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 557.94us 88.09us 2.58ms 87.68%
Req/Sec 8.82k 420.66 9.71k 76.47%
Latency Distribution
50% 533.00us
75% 583.00us
90% 650.00us
99% 0.90ms
89496 requests in 5.10s, 15.61MB read
Requests/sec: 17548.13
Transfer/sec: 3.06MB
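The fe02056 + patch runs also let us estimate what the two plugins cost per request, by converting the averaged Requests/sec of the plugin and no-plugin configurations into time per request:

```python
# Requests/sec figures copied from the fe02056 + patch runs above.
plugin_rps = [15475.07, 14312.39]  # limit-count + prometheus enabled
bare_rps   = [17293.04, 17548.13]  # no plugin

def mean(xs):
    return sum(xs) / len(xs)

# Average wall time per request, in microseconds, for each configuration.
us_with    = 1e6 / mean(plugin_rps)
us_without = 1e6 / mean(bare_rps)

overhead = us_with - us_without
print(f"plugin overhead: {overhead:.1f} us per request")  # roughly 9.7 us
```

That works out to under 10 microseconds of added latency per request for limit-count plus prometheus combined, consistent with the latency distributions above.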
Benchmark on Google Cloud (1 route), machine type n1-highcpu-8 (8 vCPUs, 7.2 GB memory): https://gist.github.com/membphis/9e7e3799c6bc83156faf2f7fe2fade7e

Flame graph: (image attached in the original issue)
1 route with same upstream
10 routes with same upstream
100 routes with same upstream
1000 routes with same upstream
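The multi-route scenarios above (10/100/1000 routes sharing one upstream) need many route documents registered in etcd. A hedged sketch of how that could be scripted against the same etcd v2 endpoint used in the curl commands above; the per-route URI scheme (`/hello_{i}`) and helper names are illustrative assumptions, not from the issue:

```python
import json
from urllib.parse import urlencode

# etcd v2 prefix used by the curl commands in this issue.
ETCD = "http://127.0.0.1:2379/v2/keys/apisix/routes"

def make_route(i):
    """Build route document number `i`: a distinct URI per route
    (hypothetical /hello_{i} scheme), all sharing the same upstream."""
    return {
        "methods": ["GET"],
        "uri": f"/hello_{i}",
        "id": i,
        "plugin_config": {},
        "upstream": {
            "type": "roundrobin",
            "nodes": {"127.0.0.1:80": 1, "127.0.0.2:80": 1},
        },
    }

def register_routes(n):
    # Network side effect: PUT each route to etcd, mirroring the curl
    # commands above. Requires a running etcd, so it is only invoked
    # when the script is run directly.
    import urllib.request
    for i in range(1, n + 1):
        data = urlencode({"value": json.dumps(make_route(i))}).encode()
        req = urllib.request.Request(f"{ETCD}/{i}", data=data, method="PUT")
        urllib.request.urlopen(req)

if __name__ == "__main__":
    register_routes(10)  # e.g. the 10-routes scenario
```

wrk would then be pointed at one (or a spread) of the registered URIs, as in the single-route runs above.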