slow performance using basic-proxy.js vs. no proxy #305

Open
robertlmullen74 opened this Issue · 16 comments

6 participants

@robertlmullen74

Hey all,
I have been running some ab tests against the basic-proxy.js service and have noticed it's about three times slower through the proxy than without it. I'm on node 0.8.8 and Ubuntu 11.10.

Using proxy

ab -c 10 -n 1000 http://127.0.0.1:8000/

Total response time was approx. 4 seconds

No proxy

ab -c 10 -n 1000 http://127.0.0.1:9000/
Total response time was approx. 1.7 seconds

I ran the proxy with node http debugging enabled for a single request and this was the output.
NODE_DEBUG=http node basic-proxy.js

1 request through proxy
HTTP: SERVER new http connection
HTTP: server response shouldKeepAlive: false
I'm proxyrequest
get base result
HTTP: outgoing message end.
HTTP: SERVER new http connection
HTTP: server response shouldKeepAlive: true
HTTP: write ret = true
HTTP: outgoing message end.
HTTP: AGENT incoming response!
HTTP: AGENT isHeadResponse false
HTTP: write ret = true
HTTP: AGENT socket keep-alive
HTTP: outgoing message end.
HTTP: server socket close
HTTP: server socket close

1 request without proxy
HTTP: SERVER new http connection
HTTP: server response shouldKeepAlive: false
HTTP: write ret = true
HTTP: outgoing message end.
HTTP: server socket close

Is the performance difference simply because we're asking the proxy to do that much more? A 3x increase seems like something is askew here. I tested this with different ab parameters, played around with the maxSockets setting, and tried a different http agent, but nothing sped up going through the proxy. I'm surprised at the discrepancy here; am I missing something?

Thanks,
Rob

@robertlmullen74

So I identified that the majority of the bottleneck is in the DNS lookup call made in the node.js source (http.js > net.js > dns.js). I added some debug statements to the node.js source and found this:

DEBUG - http.js socket write ms:796ms
DEBUG - 2012-09-24T18:52:28.797Z simpleproxy-withlisteners.js:35 (ClientRequest.) socket assigned to backend request!
DEBUG - net.js lookup dns callback ms:880ms

That's 84ms of what was a 100ms request spent just processing the DNS lookup. Node does no caching of this either, so you take that hit every time.
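If the lookup really is repeated per request, the obvious mitigation is to memoize it. The sketch below is generic, not node's actual fix: `cachedLookup` wraps any `(hostname, callback)`-style resolver, and the stub resolver stands in for the real `dns.lookup` so the example stays self-contained.

```javascript
// Memoize a callback-style lookup so repeated hostnames skip the
// resolver entirely. `resolve` stands in for dns.lookup; any function
// with the same (hostname, callback) shape works.
function cachedLookup(resolve) {
  var cache = {};
  return function (hostname, callback) {
    if (hostname in cache) {
      return callback(null, cache[hostname]);
    }
    resolve(hostname, function (err, address) {
      if (!err) cache[hostname] = address;
      callback(err, address);
    });
  };
}

// Stub resolver that counts how often it is actually called.
var calls = 0;
function fakeResolve(hostname, callback) {
  calls++;
  callback(null, '127.0.0.1');
}

var lookup = cachedLookup(fakeResolve);
var result;
lookup('localhost', function (err, addr) { result = addr; });
lookup('localhost', function (err, addr) { result = addr; });
// Second call is served from the cache; fakeResolve ran only once.
```

A real cache would also need a TTL so address changes are eventually picked up; this sketch omits that for brevity.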

I created this test script to help narrow down where the time goes:
https://gist.github.com/3834431

@indexzero
Owner

Interesting.

@Rush

Is there any fix for this?

@zlaoz

Same issue here. The ab-client machine and the http-proxy machine are both Ubuntu 13.10 servers. I used the basic-proxy.js example code:

without proxy: ab -n 10000 -c 100 http://10.10.10.1:9003/ :
Requests per second: 1975.13 #/sec
Time per request: 50.630 ms

with http-proxy: ab -n 10000 -c 100 http://10.10.10.1:8003/ :
Requests per second: 371.78 #/sec
Time per request: 268.975 ms

setNoDelay(true) is set AND http.globalAgent.maxSockets is set to 10000, and furthermore I did some Linux kernel tuning mentioned here: http://urbanairship.com/blog/2010/09/29/linux-kernel-tuning-for-c500k/

But no better results. Even a local benchmark returns the same numbers.

And it does not seem to be a problem with DNS resolution. I used this script https://gist.github.com/gabrielf/7746695#file-nodejs-dns-vs-ip-lookup-test-js (by https://gist.github.com/gabrielf) and the results looked fine:

lookup of 10.10.10.1: 1ms
lookup of 10.10.10.1: 0ms
lookup of 10.10.10.1: 0ms

@grantkl

@zlaoz That looks similar to the 200ms fixed overhead from small requests I saw in anything greater than Node v0.10.15.

@zlaoz

@grantkl so in an older node version this delay does not occur? Which version? (I am on v0.10.28.)

@zlaoz

Same with node v0.10.10.

@grantkl

I didn't see the same delay in v0.10.15 or before. There were changes in v0.10.16 that introduced a 200ms hit for my use case. It has been a couple of months since I looked into this as I started vetting other solutions for my problem.

@jcrugzz
Owner

@zlaoz I would personally try using wrk to benchmark, as ab sends HTTP 1.0 requests rather than HTTP 1.1, iirc. See if the issue is reproducible there. Otherwise I would try passing an agent to http-proxy: it does not use the global agent by default, it uses no agent.

@zlaoz

@jcrugzz thanks for the hints. With wrk the results are the same:

Running 10s test @ http://10.10.10.1:9003/ (direct)
1 threads and 100 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 40.56ms 2.25ms 62.01ms 88.99%
Req/Sec 2.47k 99.67 2.54k 87.34%
24376 requests in 10.00s, 4.95MB read
Requests/sec: 2437.32
Transfer/sec: 506.56KB

Running 10s test @ http://10.10.10.1:8003/ (http-proxy)
1 threads and 100 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 256.17ms 20.93ms 297.68ms 65.28%
Req/Sec 390.72 44.85 500.00 72.22%
3900 requests in 10.01s, 0.86MB read
Requests/sec: 389.60
Transfer/sec: 88.27KB

passing an agent to http-proxy is the next thing... hold on

@jcrugzz
Owner

@zlaoz and when passing in an agent make sure it has maxSockets set reasonably high as well.

@zlaoz

same result using:

var agent = new http.Agent();
agent.maxSockets = 10000;

var server = httpProxy.createServer({
  agent: agent,
  target: 'http://localhost:9003'
}).listen(8003);

@jcrugzz
Owner

@zlaoz hmm, and this is on Ubuntu only? Do you have the code that reproduces this, so I can grab it and test myself? I want to see if it's reproducible on OS X. I would also try running this on 0.11.x to see if this issue may already be fixed for the next version (as it seems to be core related).

Paste the wrk command you are using as well if you could :).

@jcrugzz
Owner

@zlaoz I've tested your same code locally on my MacBook and it shows an overhead of 60ms of latency with the proxy (7ms -> 60ms). Have you been able to try this on 0.11.13 and see if you get the same performance?

@zlaoz

@jcrugzz sorry for the late response! Tried it using 0.11.13: I got 50ms (direct) and 200ms (proxied). I tried it locally too and got 12ms (direct) and 90ms (proxied).
