Version 7.49.0 seems to be leaking memory #7982

Closed
phil-tutti opened this issue Apr 27, 2023 · 16 comments

Comments

@phil-tutti

How do you use Sentry?

Sentry Saas (sentry.io)

Which SDK are you using?

@sentry/node

SDK Version

7.49.0

Framework Version

@sentry/integrations": "7.49.0, @sentry/node": "7.49.0, @sentry/react": "7.49.0

Link to Sentry event

No response

SDK Setup

  // (Imports inferred from the dependency list further down: @sentry/node and @sentry/integrations)
  const Sentry = require("@sentry/node");
  const SentryIntegrations = require("@sentry/integrations");

  Sentry.init({
    dsn: "someString",
    integrations: [
      new SentryIntegrations.RewriteFrames(),
    ],
    environment: ENVIRONMENT,
    sampleRate: ERROR_SAMPLE_RATE,
  });

  // The request handler must be the first middleware on the app
  app.use(
    Sentry.Handlers.requestHandler({
      request: ["data", "headers", "method", "query_string", "url"],
    })
  );

and

app.use(Sentry.Handlers.errorHandler());

to catch errors

Steps to Reproduce

As written in this PR: #7752 (comment), we started to see a constant increase in memory that is not getting cleaned up after we switched from 7.46.0 to 7.49.0. Once we reverted the upgrade, everything went back to normal.

For SSR, we're setting up @sentry/node together with the @sentry/integrations package. @sentry/browser is not being used on the server side.

Here's also an excerpt from our package.json.

"dependencies": {
    "@emotion/cache": "11.10.7",
    "@emotion/react": "11.10.6",
    "@emotion/server": "11.10.0",
    "@emotion/styled": "11.10.6",
    "@formatjs/intl-getcanonicallocales": "2.1.0",
    "@formatjs/intl-locale": "3.1.1",
    "@formatjs/intl-pluralrules": "5.1.10",
    "@formatjs/intl-relativetimeformat": "11.1.10",
    "@loadable/component": "5.15.3",
    "@loadable/server": "5.15.3",
    "@mui/icons-material": "5.11.16",
    "@mui/lab": "5.0.0-alpha.125",
    "@mui/material": "5.11.16",
    "@mui/styled-engine": "5.11.16",
    "@mui/system": "5.11.16",
    "@optimizely/optimizely-sdk": "4.9.3",
    "@reduxjs/toolkit": "1.9.3",
    "@sentry/integrations": "7.49.0",
    "@sentry/node": "7.49.0",
    "@sentry/react": "7.49.0",
    "@tanstack/react-query": "4.29.3",
    "@tanstack/react-query-devtools": "4.29.3",
    "axios": "0.21.2",
    "classnames": "2.3.2",
    "cookie-parser": "1.4.6",
    "core-js": "3.30.0",
    "cross-fetch": "3.1.5",
    "express": "4.18.2",
    "express-useragent": "1.0.15",
    "http-proxy-middleware": "2.0.6",
    "https": "1.0.0",
    "https-browserify": "1.0.0",
    "intl": "1.2.5",
    "ky": "0.33.3",
    "lodash": "4.17.21",
    "process": "0.11.10",
    "prom-client": "14.2.0",
    "react": "18.2.0",
    "react-dom": "18.2.0",
    "react-helmet-async": "1.3.0",
    "react-hook-form": "7.43.9",
    "react-intl": "6.3.2",
    "react-lazyload": "3.2.0",
    "react-redux": "7.2.9",
    "react-router-dom": "6.8.2",
    "react-use": "17.4.0",
    "react-useportal": "1.0.18",
    "react18-input-otp": "1.1.3",
    "redux": "4.2.1",
    "resolve-url-loader": "5.0.0",
    "stream-http": "3.2.0",
    "use-debounce": "9.0.4",
    "use-immer": "0.9.0",
    "usehooks-ts": "2.9.1",
  }

We're using Express with Node 16 running on our servers. We render our React application server-side, together with MUI and Emotion.

Expected Result

A graph like this for our memory usage over time

[Screenshot: memory usage graph, 2023-04-27 07:02:52]

Actual Result

[Screenshot: memory usage graph, 2023-04-27 07:02:21]

@AbhiPrasad
Member

I attempted to reproduce this on Node 16 with a simple express app, but couldn't get anywhere. This makes me think there is something else going on 🤔.

https://github.com/AbhiPrasad/GH-7982-node-memory-leak

Maybe there's some weird behavior with Node 16 and async local storage that we haven't run into before.

@phil-tutti is it possible to generate and share a heapdump? You can reference https://nodejs.org/en/docs/guides/diagnostics/memory/using-heap-snapshot for some strategies on this.
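
If it helps, one low-friction way to grab a snapshot from a live process is Node's built-in v8.writeHeapSnapshot() (Node >= 11.13). The snippet below is only an illustrative sketch; the /debug/heapdump route is a made-up name and would need to be locked down in a real deployment:

// Minimal sketch: expose an endpoint that dumps a heap snapshot from the
// running process; open the resulting .heapsnapshot file in Chrome DevTools.
const { writeHeapSnapshot } = require("v8");

app.get("/debug/heapdump", (req, res) => {
  // writeHeapSnapshot() blocks while serializing the heap and returns the
  // generated file name (written to the process working directory).
  const file = writeHeapSnapshot();
  res.json({ file, memory: process.memoryUsage() });
});

Alternatively, starting the process with --heapsnapshot-signal=SIGUSR2 lets you trigger a snapshot externally via kill -USR2 <pid>.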

@timfish any ideas of what could be going on here?

@timfish
Collaborator

timfish commented Apr 27, 2023

So far, I've only seen a memory leak in Node < v18 with the debugger attached, but maybe that's not the only trigger.

Maybe there's some weird behavior with Node 16 and async local storage that we haven't run into before.

Yeah, I'd be reluctant to blame Node.js right off the bat, but we have seen a few issues with async hooks. In the above-linked issue it's mentioned that there have been improvements/fixes to V8, which underpins a lot of async hooks, and these unfortunately can't be backported to older Node versions.

Maybe AsyncLocalStorage is more prone to memory leaks from circular refs, etc., because of better/different scope capturing?

It would be interesting to see a heap snapshot to find out what's actually getting leaked.
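
For context, here's a generic sketch of the hub-per-request pattern that AsyncLocalStorage enables. This is just the pattern under discussion, not Sentry's actual implementation, and getCurrentHub/hubStorage are made-up names:

const { AsyncLocalStorage } = require("async_hooks");

// One storage instance shared by the whole app; each request runs inside its
// own store, so lookups made anywhere in that async context are per-request.
const hubStorage = new AsyncLocalStorage();

app.use((req, res, next) => {
  // Hypothetical per-request "hub": whatever lives in this store stays
  // reachable for as long as something in the async context references it.
  hubStorage.run({ scope: {}, requestUrl: req.url }, next);
});

function getCurrentHub() {
  return hubStorage.getStore();
}

If a long-lived object (say, a transaction that never finishes) keeps a strong reference back into that store, the whole per-request context stays reachable well after the response has gone out.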

@phil-tutti
Author

I will see what I can do to get a heapdump, but it might be tricky: we only see the issue in our production environments, since the testing environments don't have enough load on them. And deploying the version that consumes more memory to production is always a bit risky :)

@phil-tutti
Author

We are sadly not able to provide a heapdump from our side, my apologies.

@Leask

Leask commented May 4, 2023

I have the same issue here.

And the memory grows really fast.

[Screenshot: memory usage growth, 2023-05-03 8:56:20 PM]

I can trace lots of _hub objects created by transaction.js.

[Screenshot: heap inspection showing _hub objects, 2023-05-03 8:55:42 PM]

@AbhiPrasad
Member

Thanks @Leask! I think the issue here is that we're holding on to the hub reference, so @timfish, with:

Maybe AsyncLocalStorage is more prone to memory leaks from circular refs, etc because of better/different scope capturing

Looks like you're on the money!

We have a circular ref that is hub -> scope -> transaction -> hub. I think we can try just deleting the hub reference on finish? I don't think there are any consequences from doing that.

@AbhiPrasad
Member

Something like:

diff --git a/packages/core/src/tracing/transaction.ts b/packages/core/src/tracing/transaction.ts
index d62f4c3b2..c1fa4e465 100644
--- a/packages/core/src/tracing/transaction.ts
+++ b/packages/core/src/tracing/transaction.ts
@@ -206,7 +206,12 @@ export class Transaction extends SpanClass implements TransactionInterface {
 
     __DEBUG_BUILD__ && logger.log(`[Tracing] Finishing ${this.op} transaction: ${this.name}.`);
 
-    return this._hub.captureEvent(transaction);
+    const id = this._hub.captureEvent(transaction);
+
+    // @ts-ignore We need to remove hub from this transaction in order to prevent endless cycle
+    this._hub = undefined;
+
+    return id;
   }
 
   /**

@timfish
Collaborator

timfish commented May 4, 2023

We have a circular ref that is hub -> scope -> transaction -> hub

Nice find!

With your proposed change, would it still leak if finish doesn't get called for whatever reason?

@AbhiPrasad
Member

Hmm, yes it would, which is also unfortunate.

This is tricky because we want the hub you called hub.startTransaction on to be the one that finishes the transaction. Not sure how best to remove the circular ref.
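
One direction that might be worth exploring (purely a sketch, not a worked-out proposal): hold the hub through a WeakRef so the transaction -> hub edge never keeps the hub alive on its own, and fall back to a current-hub lookup if the original hub is gone by the time finish() runs:

// Sketch only, assuming a getCurrentHub()-style accessor exists as a fallback.
class Transaction {
  constructor(hub) {
    // Weak back-reference: does not prevent the hub (and its scope) from
    // being garbage collected once nothing else references them.
    this._hubRef = new WeakRef(hub);
  }

  finish(payload) {
    const hub = this._hubRef.deref() ?? getCurrentHub();
    return hub.captureEvent(payload);
  }
}

The trade-off is that the "the hub that started the transaction also finishes it" guarantee becomes best-effort, since a WeakRef target can be collected at any point once it's otherwise unreachable.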

@Leask

Leask commented May 5, 2023

Cool, thanks for the follow-up! 🍻

@timfish
Collaborator

timfish commented May 6, 2023

@Leask are you able to share more details of the setup that's causing the leak, or share your heap snapshots with me? I've tried a few basic examples and haven't managed to reproduce this locally yet.

I do have a possible bug fix for this that removes the circular reference, but my concern is that circular references alone should not be causing the leak. Modern JavaScript engines can garbage collect basic circular references like this, which suggests the cause is more likely an actual reference to something that is stopping collection.
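
To back that up, here's a throwaway snippet (not from this thread; run with node --expose-gc) that builds a lot of hub -> scope -> transaction -> hub cycles and then forces a collection; heap usage drops back close to the baseline because the unreachable cycles are collected:

function churn() {
  for (let i = 0; i < 1e6; i++) {
    const hub = {};
    const transaction = { _hub: hub };
    hub.scope = { transaction }; // hub -> scope -> transaction -> hub
  }
}

console.log("heapUsed before:", process.memoryUsage().heapUsed);
churn();
global.gc(); // provided by --expose-gc
console.log("heapUsed after GC:", process.memoryUsage().heapUsed);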

@github-actions
Contributor

github-actions bot commented Jul 7, 2023

This issue has gone three weeks without activity. In another week, I will close it.

But! If you comment or otherwise update it, I will reset the clock, and if you label it Status: Backlog or Status: In Progress, I will leave it alone ... forever!


"A weed is but an unloved flower." ― Ella Wheeler Wilcox 🥀

@Leask

Leask commented Jul 7, 2023

Sorry about this, my wife delivered a new baby girl. I am so busy these days.
I will check it out later. If this issue reproduces, I will get back here.
Thanks for your work @timfish!

@AbhiPrasad
Member

I tried to reproduce this myself on the latest version once again, and was unable to. Closing this issue as a result, but if anyone can get a reproduction, we can re-open and investigate! Thanks!

AbhiPrasad closed this as not planned (won't fix, can't repro, duplicate, stale) on Jul 27, 2023
@Leask

Leask commented Jul 27, 2023

Yes, weeks ago, I upgraded to the latest version and cannot reproduce this issue anymore.
Thanks! 🍻

@phil-tutti
Author

I want to give an update here in case someone else runs into the same issue.

We migrated our React app from a custom-made solution to Next.js and tried to update Sentry again. This had the same effect, with memory slowly growing until the process eventually ran out of memory (OOM), so we decided to stick with the older version.

Just recently though, we switched from Node 18.18.1 to 20.10.0 and updated the Sentry version again. We monitored the memory again, and now it seems to stay stable.

TL;DR: updating to Node 20.10.0 seems to have fixed the memory leak for us.
