
Error: socket hang up in new relic library. #75

Closed
elliotstokes opened this issue Oct 30, 2013 · 16 comments

@elliotstokes

We appear to be getting this error in our New Relic logs, and it looks like it's coming from the New Relic library. Below is the stack trace:

Error: socket hang up

at createHangUpError (http.js:1445:15)
at CleartextStream.socketCloseListener (http.js:1495:23)
at CleartextStream.EventEmitter.emit (events.js:117:20)
at tls.js:677:10
at /var/app/current/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:177:31
at process._tickDomainCallback (node.js:459:13)
at process.<anonymous> (/var/app/current/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/index.js:18:15)
@othiym23
Contributor

What version of Node are you running, and what version of the agent? There's a problem under Node 0.8 with recent-ish versions of one of the New Relic module's dependencies that causes errors that were previously handled to turn into unrecoverable crashes.

If this isn't crashing your app (and if it is, I apologize, because New Relic tries hard to avoid crashers), are you sure these errors aren't just ordinary failed SSL connection attempts? When you're running the New Relic module, you're going to see async-listener and continuation-local-storage showing up in stack traces in places that are completely unrelated to New Relic (which is what appears to be the case here). These are fairly low-level modules that wrap core node functionality to do their thing.

Either way, let me know what you find, and thanks for the report!
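The point about async-listener and continuation-local-storage showing up in unrelated stack traces can be illustrated with a minimal, self-contained sketch (the function names here are hypothetical, chosen to echo the `asyncWrap` frame in the traces above): a library that wraps user callbacks becomes a frame on the call stack whenever the callback runs, so any error raised inside the callback carries the wrapper's name even though the wrapper didn't cause it.

```javascript
// Sketch of async-listener-style callback wrapping (hypothetical names).
function wrapCallback(cb) {
  return function asyncWrap() {          // analogous to the glue.js wrapper frame
    return cb.apply(this, arguments);
  };
}

// An unrelated error raised inside the wrapped callback:
const wrapped = wrapCallback(function () {
  return new Error('socket hang up').stack;
});

const stack = wrapped();
// The wrapper appears in the trace even though it didn't cause the error:
console.log(stack.includes('asyncWrap')); // → true
```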

@rodrigok

rodrigok commented Nov 3, 2013

This error was occurring for me too. I'm using version 0.10.21 of Node.

@othiym23
Contributor

othiym23 commented Nov 3, 2013

@rodrigok, is this causing your app to crash? As I said in my response to @elliotstokes, I'd like to figure out if New Relic is just showing up in the stacktrace (which it's going to do, because of how the module's instrumentation works), or if it's actually either causing problems or turning a recoverable error into a fatal crash. As it stands, I still don't know if there's something here that New Relic needs to fix.

@rodrigok

rodrigok commented Nov 3, 2013

No, this problem was not causing my app to crash.

@othiym23
Contributor

othiym23 commented Nov 3, 2013

Thanks! Let us know if you run into situations like this where it causes problems for your app. Waiting to hear back from @elliotstokes before resolving this issue.

@othiym23
Contributor

othiym23 commented Nov 9, 2013

Closing; Elliot hasn't been back since filing it.

@othiym23 othiym23 closed this as completed Nov 9, 2013
@mike-lang

This is happening in my app as well. It doesn't crash my app, but it does clutter my logs with what look like errors.

I am running node v0.8.26, and my log is filled with stuff that looks like this:

944 <13>1 2013-11-18T19:21:52.184486+00:00 app web.2 - - {"name":"newrelic","hostname":"5ae5d0e1-8d2c-4aaf-9557-91faeeed7bed","pid":2,"component":"data_sender","level":20,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"Attempting to send metric_data to collector collector-6.newrelic.com failed:","time":"2013-11-18T19:21:52.184Z","v":0}
{"name":"newrelic","hostname":"5ae5d0e1-8d2c-4aaf-9557-91faeeed7bed","pid":2,"component":"connection","level":20,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"An error occurred sending message metricData (remote name metric_data):","time":"2013-11-18T19:21:52.184Z","v":0}
{"name":"newrelic","hostname":"5ae5d0e1-8d2c-4aaf-9557-91faeeed7bed","pid":2,"level":40,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"Merging metrics from last harvest cycle because:","time":"2013-11-18T19:21:52.184Z","v":0}

944 <13>1 2013-11-18T19:54:49.619246+00:00 app web.3 - - {"name":"newrelic","hostname":"429c9b48-c0ff-44ac-8522-745d47c1b417","pid":2,"component":"data_sender","level":20,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"Attempting to send metric_data to collector collector-6.newrelic.com failed:","time":"2013-11-18T19:54:49.618Z","v":0}
{"name":"newrelic","hostname":"429c9b48-c0ff-44ac-8522-745d47c1b417","pid":2,"component":"connection","level":20,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"An error occurred sending message metricData (remote name metric_data):","time":"2013-11-18T19:54:49.619Z","v":0}
{"name":"newrelic","hostname":"429c9b48-c0ff-44ac-8522-745d47c1b417","pid":2,"level":40,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"Merging metrics from last harvest cycle because:","time":"2013-11-18T19:54:49.619Z","v":0}

944 <13>1 2013-11-18T20:49:49.778274+00:00 app web.2 - - {"name":"newrelic","hostname":"5ae5d0e1-8d2c-4aaf-9557-91faeeed7bed","pid":2,"component":"data_sender","level":20,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"Attempting to send metric_data to collector collector-6.newrelic.com failed:","time":"2013-11-18T20:49:49.777Z","v":0}
{"name":"newrelic","hostname":"5ae5d0e1-8d2c-4aaf-9557-91faeeed7bed","pid":2,"component":"connection","level":20,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"An error occurred sending message metricData (remote name metric_data):","time":"2013-11-18T20:49:49.778Z","v":0}
{"name":"newrelic","hostname":"5ae5d0e1-8d2c-4aaf-9557-91faeeed7bed","pid":2,"level":40,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"Merging metrics from last harvest cycle because:","time":"2013-11-18T20:49:49.778Z","v":0}

944 <13>1 2013-11-18T20:49:50.160367+00:00 app web.3 - - {"name":"newrelic","hostname":"429c9b48-c0ff-44ac-8522-745d47c1b417","pid":2,"component":"data_sender","level":20,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"Attempting to send metric_data to collector collector-6.newrelic.com failed:","time":"2013-11-18T20:49:50.159Z","v":0}
{"name":"newrelic","hostname":"429c9b48-c0ff-44ac-8522-745d47c1b417","pid":2,"component":"connection","level":20,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"An error occurred sending message metricData (remote name metric_data):","time":"2013-11-18T20:49:50.160Z","v":0}
{"name":"newrelic","hostname":"429c9b48-c0ff-44ac-8522-745d47c1b417","pid":2,"level":40,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"Merging metrics from last harvest cycle because:","time":"2013-11-18T20:49:50.160Z","v":0}

944 <13>1 2013-11-18T21:02:57.580381+00:00 app web.1 - - {"name":"newrelic","hostname":"94cfb481-945e-4b04-9d32-73a48323ac97","pid":2,"component":"data_sender","level":20,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"Attempting to send metric_data to collector collector-6.newrelic.com failed:","time":"2013-11-18T21:02:57.580Z","v":0}
{"name":"newrelic","hostname":"94cfb481-945e-4b04-9d32-73a48323ac97","pid":2,"component":"connection","level":20,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"An error occurred sending message metricData (remote name metric_data):","time":"2013-11-18T21:02:57.581Z","v":0}
{"name":"newrelic","hostname":"94cfb481-945e-4b04-9d32-73a48323ac97","pid":2,"level":40,"err":{"message":"socket hang up","name":"Error","stack":"Error: socket hang up\n    at createHangUpError (http.js:1379:15)\n    at Socket.socketCloseListener (http.js:1430:23)\n    at Socket.EventEmitter.emit (events.js:126:20)\n    at Socket._destroy.destroyed (net.js:358:10)\n    at Proxy.callback.args.(anonymous function) (/app/node_modules/nodetime/lib/core/proxy.js:131:20)\n    at asyncWrap (/app/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)\n    at process.startup.processNextTick.process._tickCallback (node.js:245:9)","code":"ECONNRESET"},"msg":"Merging metrics from last harvest cycle because:","time":"2013-11-18T21:02:57.581Z","v":0}

1116 <13>1 2013-11-18T21:11:33.294574+00:00 app web.2 - - {"name":"newrelic","hostname":"5ae5d0e1-8d2c-4aaf-9557-91faeeed7bed","pid":2,"component":"express","level":10,"err":{"message":"/app/lib/../views/newurltransformrule.jade:17\n    15|                 select(id='rule_type', name='rule_type')\n    16|                     option(value='source_match') Source Match\n  > 17| #                    option(value='url_match') URL Match\n    18|             .formRow\n    19|                 .fieldLabel\n    20|                     label(for='url_match_regex') URL Matching Pattern (optional):\n\nunexpected token \"indent\"","name":"Error","stack":"Error: /app/lib/../views/newurltransformrule.jade:17\n    15|                 select(id='rule_type', name='rule_type')\n    16|                     option(value='source_match') Source Match\n  > 17| #                    option(value='url_match') URL Match\n    18|             .formRow\n    19|                 .fieldLabel\n    20|                     label(for='url_match_regex') URL Matching Pattern (optional):\n\nunexpected toke
n \"indent\"\n    at Object.Parser.parseExpr (/app/node_modules/jade/lib/parser.js:237:15)\n    at Object.Parser.parse (/app/node_modules/jade/lib/parser.js:130:25)\n    at parse (/app/node_modules/jade/lib/jade.js:100:62)\n    at Object.exports.compile (/app/node_modules/jade/lib/jade.js:163:9)\n    at Function.exports.compile (/app/node_modules/express/lib/view.js:68:33)\n    at ServerResponse.res._render (/app/node_modules/express/lib/view.js:417:18)\n    at ServerResponse.res.render (/app/node_modules/express/lib/view.js:318:17)\n    at ServerResponse.render (/app/node_modules/newrelic/lib/instrumentation/express.js:120:21)\n    at module.exports.app.post.rule_priority (/app/lib/routes/url_transform_routes.js:38:13)\n    at callbacks (/app/node_modules/express/lib/router/index.js:272:11)","path":"/app/lib/../views/newurltransformrule.jade"},"msg":"Express 2 rendering for metric View/newurltransformrule/Rendering failed for transaction 1697:","time":"2013-11-18T21:11:33.294Z","v":0}

Can you provide any advice on what this is? It looks like it is trying to report an inability to send gathered data back to New Relic's servers. If these messages are expected in normal operation, is there anything that can be done to configure the verbosity? Or is that something that would have to be handled down in that continuation-local-storage module?

@brunkle brunkle reopened this Nov 19, 2013
@elliotstokes
Author

Sorry I haven't been around for the last few weeks. I can confirm that the application does not crash as others have already mentioned.

@spmason

spmason commented Nov 21, 2013

I've been seeing this error too, about once a day. We're using Node 0.8.7 (though we plan to upgrade to the latest 0.10 soon).

The app seems to stay up during this time, or at least NR doesn't report it as being down (it mostly happens outside of office hours).

@othiym23
Contributor

Hello, everybody!

A few responses:

  1. These are actual errors!
  2. In the case of @mike-lang, they are indeed because of transient downtime of New Relic's servers. I can't say if that's the case for all of you.
  3. The chattiness of the logs can be minimized by setting logging.level (or NEW_RELIC_LOG_LEVEL, if you're configuring with environment variables) to INFO or WARN. If you're running New Relic in your app in production, this is a good idea anyway.
  4. In general, the messages at WARN level are important, because you may have altered your host configuration in such a way that New Relic for Node can't report back to our servers, which is why the message about data being reaggregated is at a higher log level.

Otherwise, this is all behaving as designed. Thanks for the additional details!

@mike-lang

Thank you much for the responses here @othiym23!

@spmason

spmason commented Nov 22, 2013

I had another 67 of these errors reported in my control panel this morning. This issue is causing our error rate to spike, NR is emailing us about it, and it's completely outside of our control to fix.

"Behaving as designed" doesn't seem like an appropriate resolution to this issue, given that.

@othiym23
Contributor

@spmason Are you sure the errors in your error count are coming from failed connection attempts to New Relic? The logic to talk to our servers lives in a completely different piece of the module that isn't tied to the piece that traces app errors. I agree that if that's happening, it's a bug (and an annoying one!), but looking at the code and our test output, I'm not seeing any evidence that anything like this is what's going on.

@spmason

spmason commented Nov 22, 2013

It seems to be coming from the newrelic module:

Error: socket hang up
at createHangUpError (http.js:1263:15)
at CleartextStream.socketCloseListener (http.js:1314:23)
at CleartextStream.EventEmitter.emit (events.js:115:20)
at SecurePair.destroy (tls.js:904:22)
at asyncWrap (/home/<redacted>/node_modules/newrelic/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:299:29)
at process.startup.processNextTick.process._tickCallback (node.js:244:9)

Also, the Request parameters list the URL as coming from socket.io, but we've set an ignore rule for that URL elsewhere and it's not being reported otherwise - should this even be captured I wonder?

@othiym23
Contributor

Right now, New Relic doesn't support SSL between the module and our servers (I'm adding support for that right now!), so that error isn't coming from our code. New Relic does appear in the stack trace, but that's only because of the instrumentation, which more or less guarantees that CLS or the asyncListener (the enabling technologies used by the module) are going to be in every stack trace coming out of an asynchronous callback or event listener.

Ignoring rules apply to the socket.io requests, but not to errors, which are always traced unless they're tied to an HTTP status code that you're specifically ignoring. Also, if it seems like the ignoring rule isn't sticking and you're serving your socket.io routes from Express or Restify, you should take a look at using the API call for ignoring transactions instead of using rules – because of how the Express and Restify router introspection work, they override the ignoring rules which are meant for use by developers working without a framework that names their routes.
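The API-call approach mentioned above can be sketched as follows. This is a hedged, self-contained illustration: the agent is stubbed out so the snippet runs on its own (a real app would `require('newrelic')` instead), `setIgnoreTransaction` is the agent API call being referred to, and the middleware shape and route path are hypothetical.

```javascript
// Stub standing in for require('newrelic') so this sketch is self-contained:
const newrelic = {
  ignored: false,
  setIgnoreTransaction(flag) { this.ignored = flag; }
};

// Express-style middleware that ignores socket.io transactions via the API
// instead of ignore rules, which framework router introspection can override.
function ignoreSocketIo(req, res, next) {
  if (req.url.indexOf('/socket.io/') === 0) {
    newrelic.setIgnoreTransaction(true);
  }
  next();
}

ignoreSocketIo({ url: '/socket.io/1/xhr-polling' }, {}, function () {});
console.log(newrelic.ignored); // → true
```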

@spmason

spmason commented Nov 22, 2013

Thanks for the explanation(s) @othiym23, that all makes perfect sense. We talk to some downstream server over SSL, so it must be those.

Sorry if I wasted anyone's time here, at least if someone else has the same problem this ticket might help them out (or at least not blame you guys!)

Thanks again for the help
