
Destroy socket on unsuccessful CONNECT #46

Closed
wants to merge 1 commit into from

Conversation


@danypype danypype commented Feb 7, 2019

When the proxy server replies with a statusCode !== 200 to the CONNECT request, the socket is left half-open and sits in the CLOSE_WAIT state until the process exits.

This change makes sure sockets are destroyed when the proxy server is unable to establish a connection with the upstream server.

Here's a simple test to illustrate the problem:

const tunnel = require('.');
const http = require('http');
const EventEmitter = require('events');
const { exec } = require('child_process');

const HOST = '127.0.0.1';
const PORT = 3000;
const proxy = {
  host: HOST,
  port: PORT,
};

const server = http.createServer();

server.on('connect', function (req, socket, head) {
  socket.end('HTTP/1.1 502 Bad Gateway\r\n\r\n');
});

server.listen(proxy, async function () {

  const agent = tunnel.httpsOverHttp({ proxy });

  async function createSocket() {
    await new Promise(function (resolve, reject) {
      const request = new EventEmitter();
      const options = {
        request,
    host: 'www.google.com', // hostname only; a scheme here would produce a malformed CONNECT target
        port: 443,
      };

      request.once('error', reject);
      agent.createSocket(options, resolve);
    });
  }

  for (let i = 1; i <= 10; i++) {
    try {
      console.log(`creating socket #${i}`);
      await createSocket();
    } catch (e) {
      console.error(e.message);
    }
  }

  exec('netstat -an | grep CLOSE_WAIT || true', function (error, output) {
    if (error) throw error;

    if (output.length) {
      console.log('sockets in CLOSE_WAIT state:');
      console.log(output);
    } else {
      console.log('no sockets in CLOSE_WAIT state');
    }

    server.close();
  });
});

Running from request/tunnel-agent:

~/tunnel-agent$ node test.js
creating socket #1
tunneling socket could not be established, statusCode=502
creating socket #2
tunneling socket could not be established, statusCode=502
creating socket #3
tunneling socket could not be established, statusCode=502
creating socket #4
tunneling socket could not be established, statusCode=502
creating socket #5
tunneling socket could not be established, statusCode=502
creating socket #6
tunneling socket could not be established, statusCode=502
creating socket #7
tunneling socket could not be established, statusCode=502
creating socket #8
tunneling socket could not be established, statusCode=502
creating socket #9
tunneling socket could not be established, statusCode=502
creating socket #10
tunneling socket could not be established, statusCode=502
sockets in CLOSE_WAIT state:
tcp4       0      0  127.0.0.1.65399        127.0.0.1.3000         CLOSE_WAIT 
tcp4       0      0  127.0.0.1.65398        127.0.0.1.3000         CLOSE_WAIT 
tcp4       0      0  127.0.0.1.65397        127.0.0.1.3000         CLOSE_WAIT 
tcp4       0      0  127.0.0.1.65396        127.0.0.1.3000         CLOSE_WAIT 
tcp4       0      0  127.0.0.1.65395        127.0.0.1.3000         CLOSE_WAIT 
tcp4       0      0  127.0.0.1.65394        127.0.0.1.3000         CLOSE_WAIT 
tcp4       0      0  127.0.0.1.65393        127.0.0.1.3000         CLOSE_WAIT 
tcp4       0      0  127.0.0.1.65392        127.0.0.1.3000         CLOSE_WAIT 
tcp4       0      0  127.0.0.1.65391        127.0.0.1.3000         CLOSE_WAIT 
tcp4       0      0  127.0.0.1.65390        127.0.0.1.3000         CLOSE_WAIT

Running from alltherooms/tunnel-agent:

~/tunnel-agent$ node test.js
creating socket #1
tunneling socket could not be established, statusCode=502
creating socket #2
tunneling socket could not be established, statusCode=502
creating socket #3
tunneling socket could not be established, statusCode=502
creating socket #4
tunneling socket could not be established, statusCode=502
creating socket #5
tunneling socket could not be established, statusCode=502
creating socket #6
tunneling socket could not be established, statusCode=502
creating socket #7
tunneling socket could not be established, statusCode=502
creating socket #8
tunneling socket could not be established, statusCode=502
creating socket #9
tunneling socket could not be established, statusCode=502
creating socket #10
tunneling socket could not be established, statusCode=502
no sockets in CLOSE_WAIT state


Hamper commented Feb 8, 2019

What is the difference between this PR and #21? Which solution is better, in your opinion?


danypype commented Feb 8, 2019

@Hamper, good question. I think it depends on your use case. #21 adds the socket to the TunnelingAgent#sockets array, which allows for later reuse.

If the proxy in question keeps its connection with the client open on error — that is, when a connection with the upstream can't be established — then the client will be able to reuse this connection to issue a new CONNECT.

But keeping in mind that the proxy might close its side of the connection when the upstream connection can't be established, it seems more reasonable to me to have the client explicitly close its end of the connection with the proxy in this case.


insightfuls commented Jun 27, 2019

I've just found this bug after three weeks of debugging why our services were being brought down by AppDynamics, which uses the request module, which in turn uses this module, through our corporate proxy.

I believe this PR is the correct solution. #21 looks buggy to me. There is no expectation that sockets making CONNECT requests can be reused.

Is there any chance we could get this merged in and published?

I'm not sure who the maintainer is here. Is it @mikeal perhaps?

If assistance is required, I'm happy to volunteer.

@seiya-git

@insightfuls, there already exists a repo with all pull requests merged:
https://github.com/seiya-npm/tunnel-agent
This pull request will be merged into my fork this evening.

Hamper added a commit to seiya-npm/tunnel-agent that referenced this pull request Jun 29, 2019
@insightfuls

Thanks @seiya-git; however, it doesn't do the ecosystem much good until it's published to npm, so that modules which consume it (such as request), and ultimately services and applications, benefit from the fix. So I'm interested in how we might make that happen.

@danypype danypype closed this Sep 22, 2022
@danypype danypype reopened this Sep 22, 2022
@danypype danypype closed this Sep 22, 2022