
FinalizationGroup in combination with buffers seems to allocate memory without releasing it #30853

Closed
puzpuzpuz opened this issue Dec 8, 2019 · 4 comments
Labels
memory Issues and PRs related to the memory management or memory footprint. v8 engine Issues and PRs related to the V8 dependency.

Comments

@puzpuzpuz
Member

puzpuzpuz commented Dec 8, 2019

Recently I was working on an experimental Buffer pool implementation based on the FinalizationGroup API (see #30683) and encountered a weirdness that I'd like to clarify. It seems that node doesn't free memory when a FinalizationGroup is used to track Buffers (as "holdings") under certain conditions. This issue may be related to the off-heap memory allocator or to OS memory behavior, but I'd like to confirm that.

Here is the simplest reproducer I could find:

'use strict';

const fg = new FinalizationGroup(finalizer);

// 8 ticks, 1GB per each => 8GB total
const ticks = 8;
const bufsPerTick = 1024;
const size = 1024 * 1024;

let slices = [];
let tick = 0;
setInterval(() => {
  tick += 1;
  if (tick === ticks) console.log('Registered all slices');
  if (tick > ticks) {
    slices = [];
    return;
  }

  slices = [];
  for (let i = 0; i < bufsPerTick; i++) {
    const buf = Buffer.alloc(size);
    const slice = buf.slice();
    slices.push(slice);
    fg.register(slice, buf);
  }
}, 500);

let finalized = 0;
function finalizer(iter) {
  for (const _ of iter) {
    finalized += 1;
    if (finalized === ticks * bufsPerTick)
      console.log('All finalizer callbacks are triggered');
  }
}

When this script is run under the --harmony-weak-refs flag, the node process consumes about 2GB of physical memory on my machine (and about 2.5GB of virtual memory), and that value doesn't decrease even after 10 minutes.

On the other hand, if you comment out the fg.register(slice, buf); line, you'll see that resident memory consumption eventually goes down to ~32KB (virtual is ~600MB).
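
A minimal sketch I can append to the repro to watch memory from inside the process instead of via OS tools (it only uses process.memoryUsage(), nothing specific to this issue):

// Periodically print the OS-level RSS alongside V8's off-heap ("external")
// accounting, so runs with and without fg.register() can be compared.
const mb = (n) => (n / 1024 / 1024).toFixed(1) + 'MB';
setInterval(() => {
  const { rss, external, heapUsed } = process.memoryUsage();
  console.log(`rss=${mb(rss)} external=${mb(external)} heapUsed=${mb(heapUsed)}`);
}, 5000);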

Once again, I'm not sure whether this is a bug, but I'd like to understand the reason for this behavior.

@Jamesernator

There's a decent chance it's on the V8 side; check this thread for the implementation status: https://bugs.chromium.org/p/v8/issues/detail?id=8179

Also try Node 13; it's possible the issue has already been fixed.

@puzpuzpuz
Member Author

Thanks @Jamesernator. I've tried running the script on v13.3.1-nightly20191204355b48bd06 and I still see the same behavior when FinalizationGroup.register is used.

What's even more interesting, with the fg.register(slice, buf); line commented out, I see about 900MB of resident memory consumption and it doesn't go down. That's quite different from the memory consumption of v12.13.1 in the same scenario.

So maybe it's not related to the FG API itself, but to something else?

@BridgeAR BridgeAR added memory Issues and PRs related to the memory management or memory footprint. v8 engine Issues and PRs related to the V8 dependency. labels Dec 20, 2019
@puzpuzpuz
Member Author

This issue may be related to suboptimal glibc allocator behavior, which can produce large RSS values in benchmarks like this. See #21973
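
One way to test that hypothesis is a quick check appended to the repro (a sketch only; it assumes the process is also started with --expose-gc so that global.gc() is available): some time after all finalizer callbacks have fired, force a full GC and compare V8's off-heap accounting with the OS-level RSS. If external drops while rss stays high, the memory was returned to the allocator but not to the OS, which would point at glibc rather than node or V8.

// Sketch: run with --expose-gc in addition to --harmony-weak-refs.
setTimeout(() => {
  if (typeof global.gc === 'function') global.gc(); // force a full GC
  const { rss, external } = process.memoryUsage();
  const mb = (n) => (n / 1024 / 1024).toFixed(1) + 'MB';
  console.log(`after GC: rss=${mb(rss)} external=${mb(external)}`);
}, 60 * 1000); // well after 'All finalizer callbacks are triggered' is printed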

@puzpuzpuz
Member Author

Closing this one, as it doesn't seem to be related to node or V8. Feel free to comment if you experience the same behavior.
