Set headers prior to calling writeHead() (fixes #316)

This allows the connect.compress() middleware to work on proxied
requests.

If the headers are not manually set, then the compress middleware
will be invoked in the writeHead() call:

  https://github.com/nodejitsu/node-http-proxy/blob/1df2b30e84f078d86e744601bd6fc59381a1c7b3/lib/node-http-proxy/http-proxy.js#L260

before the headers have been written. It will therefore perform its
content-type check:

  https://github.com/senchalabs/connect/blob/7b5621b8dd4d9d82296b9b5d7bf3bea10de37a6c/lib/middleware/compress.js#L94

against headers that have not yet been set, so the check will fail
and compression will be disabled.
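
A self-contained sketch of why that check fails (illustrative only,
not the actual connect or node-http-proxy source; it assumes, as the
message above describes, that connect patches writeHead() to emit a
'header' event and that compress() inspects res.getHeader() from
that listener):

  var http = require('http');

  // Minimal stand-in for connect's writeHead() patch: emit 'header'
  // just before the real writeHead() runs.
  function patchWriteHead(res) {
    var writeHead = res.writeHead;
    res.writeHead = function () {
      res.emit('header');
      return writeHead.apply(res, arguments);
    };
  }

  http.createServer(function (req, res) {
    patchWriteHead(res);

    // Stand-in for the compress() content-type filter.
    res.on('header', function () {
      // Only headers set via res.setHeader() are visible here.
      var type = res.getHeader('Content-Type');
      console.log(type); // undefined in the broken case below
      if (!/json|text|javascript/.test(type || '')) return;
      // ...compression would be enabled here...
    });

    // The case this commit fixes: headers passed as writeHead()
    // arguments are invisible to the 'header' listener above.
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end('hello');
  }).listen(8000);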

Furthermore, even if that check were bypassed, compress() would be
unable to delete the content-length header:

  https://github.com/senchalabs/connect/blob/7b5621b8dd4d9d82296b9b5d7bf3bea10de37a6c/lib/middleware/compress.js#L123

because at that point it would not yet exist (it is only written after
the 'header' event has been processed by compress). The effect of that
bug is that keep-alive connections stall for several minutes because
the browser keeps waiting for more data, based on the original
content-length header that was set prior to compression.
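
Inside the same kind of 'header' listener, the corresponding attempt
to drop the stale length (again an illustration of the mechanism, not
the real compress() source) has nothing to operate on, because the
header only exists in the writeHead() arguments, not on `res`:

  res.on('header', function () {
    // The compressed body will have a different length, so the
    // original Content-Length must be dropped; in the broken case
    // this is a no-op and the stale value is still written out.
    res.removeHeader('Content-Length');
  });

Copying the headers onto the response with setHeader() before calling
writeHead(), as the change below does, makes them visible to
getHeader() and removeHeader() from inside that event, which addresses
both problems.
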
commit 8c999c0f95515f0a95ef9a1d38df80cf63c445ef (1 parent: 1df2b30)
Author: @fmarier
Showing 1 changed file with 6 additions and 1 deletion.

lib/node-http-proxy/http-proxy.js
@@ -257,7 +257,12 @@ HttpProxy.prototype.proxyRequest = function (req, res, buffer) {
     }
 
     // Set the headers of the client response
-    res.writeHead(response.statusCode, response.headers);
+    for (var key in response.headers) {
+      if (response.headers.hasOwnProperty(key)) {
+        res.setHeader(key, response.headers[key]);
+      }
+    }
+    res.writeHead(response.statusCode);
 
     // If `response.statusCode === 304`: No 'data' event and no 'end'
     if (response.statusCode === 304) {