
Ram leak? #716

Closed
NotDemonix opened this issue Aug 30, 2023 · 21 comments · Fixed by #786 or #791


@NotDemonix

It seems that if I don't call clearAllCache(), memory usage goes up every time something is rendered.
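
A minimal sketch of that pattern, assuming a plain Node ESM script; the drawing and the loop count are placeholders, not the original code, and clearAllCache is the package export used in the report:

import { createCanvas, clearAllCache } from '@napi-rs/canvas'

async function renderOnce() {
  // Placeholder render; a real app draws its own content here.
  const canvas = createCanvas(488, 331)
  const ctx = canvas.getContext('2d')
  ctx.fillStyle = 'white'
  ctx.fillRect(0, 0, canvas.width, canvas.height)
  // Encode and discard the result.
  await canvas.encode('png')
}

for (let i = 0; i < 200; i++) {
  await renderOnce()
  // Without this call, RSS reportedly keeps climbing.
  clearAllCache()
}
console.info(process.memoryUsage().rss / 1024 / 1024 + ' MB RSS')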

@themrrobert

themrrobert commented Sep 21, 2023

I can confirm this issue.

The issue does not exist in 0.1.38, but it does exist in 0.1.44.

Basically there is a huge memory leak: after generating a few hundred images (and discarding them), RSS usage went up to > 1.4 GB.

Even after clearAllCache(), it only recovered around 400 MB (1.0 GB still used).

In 0.1.38, the same test always stays around 600 MB RSS, never above 700 MB.

In the picture below, I am running 0.1.44 and running the following three commands:
util.inspect(process.memoryUsage())

clearAllCache()

util.inspect(process.memoryUsage())

[screenshot: memory usage before and after clearAllCache() on 0.1.44]

In the picture below, I am running 0.1.38. While RSS does spike to around 1 GB while the pictures are generating and shortly after, within 15 seconds RSS is back to ~600 MB, as you can see here:
[screenshot: RSS settling back to ~600 MB on 0.1.38]

Test conditions:

Node: v18.16.1
OS: Windows 11 and Linux Debian 10.13
@napi-rs/canvas version: 0.1.38 (ok) and 0.1.44 (bugged)
I generated 200 images per test, size 488 x 331, via new Canvas() and getContext('2d'), with 56 individual PNGs loaded as Images via loadImage(), overlaid onto the parent image with drawImage(), a simple text header drawn via the 2D context, exported as PNGs, and discarded.
I repeated the test with all images drawn 'at the same time' via promiseArray.push() ... await Promise.all(promiseArray), and also awaiting each image individually, with virtually identical results. The results shown above are from the promiseArray variant (sketched below).
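
A hedged sketch of that promiseArray variant, reconstructed from the description above; the overlay file paths, positions, and header text are placeholders, and the header is drawn with fillText() here:

import { createCanvas, loadImage } from '@napi-rs/canvas'

// 56 individual PNGs loaded once as Images (paths are placeholders).
const overlays = await Promise.all(
  Array.from({ length: 56 }, (_, i) => loadImage(`./overlays/${i}.png`))
)

async function renderOne() {
  const canvas = createCanvas(488, 331)
  const ctx = canvas.getContext('2d')
  // Overlay the preloaded images onto the parent canvas.
  for (const img of overlays) {
    ctx.drawImage(img, 0, 0)
  }
  // Simple text header.
  ctx.font = '24px sans-serif'
  ctx.fillStyle = 'black'
  ctx.fillText('Header', 10, 30)
  // Export as PNG; the buffer is discarded by the caller.
  return canvas.encode('png')
}

const promiseArray = []
for (let i = 0; i < 200; i++) {
  promiseArray.push(renderOne())
}
await Promise.all(promiseArray)
console.info(process.memoryUsage().rss / 1024 / 1024 + ' MB RSS')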

@el9in

el9in commented Nov 30, 2023

I have a memory leak too.

@7te3ep

7te3ep commented Jan 20, 2024

Same here; even clearAllCache() doesn't work.

@7te3ep

7te3ep commented Jan 20, 2024

(quoting themrrobert's report above)

I have the issue in 0.1.38 too.

@SteadEXE

SteadEXE commented Jan 23, 2024

I confirm the issue: with 0.1.44, RAM never gets released.
Downgrading to 0.1.37 fixes the issue.

@barthuijgen

I tested on 0.1.45 and saw no significant memory leak when inspecting process.memoryUsage(), but once my application was containerised I saw HUGE memory leaks in the docker stats memory usage graphs.

I managed to narrow it down to calling canvas.toBuffer() or canvas.encode(): as soon as I left that line out, the memory leak stopped occurring, but I also had no output...

I'm talking about 60~80 MB of memory increase per render of an 8 MB PNG.
With that much increase per render, I was consistently getting over 1 GB of usage, while on 0.1.41 it never goes above 200 MB and, after being idle for a bit, sits at 75 MB.

And yes, I also tried clearAllCache(); it did nothing.

Since this package has bindings to a native binary, it makes sense that Node.js does not detect the memory leak: the memory is allocated on the native side, outside the V8 heap.
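
A small sketch of that isolation, logging both RSS and the V8 heap so the native growth is visible; the canvas size, format, and iteration counts are illustrative rather than taken from the report above:

import { createCanvas } from '@napi-rs/canvas'

const ENCODE = true // flip to false to check whether the growth only appears when encoding

for (let i = 0; i < 500; i++) {
  const canvas = createCanvas(1920, 1080)
  const ctx = canvas.getContext('2d')
  ctx.fillStyle = 'black'
  ctx.fillRect(0, 0, canvas.width, canvas.height)
  if (ENCODE) {
    // The suspect call: encoding the canvas to a buffer.
    canvas.toBuffer('image/png')
  }
  if (i % 50 === 0) {
    const { rss, heapUsed } = process.memoryUsage()
    // heapUsed stays flat because the leaked memory is native, outside the V8 heap;
    // rss is what grows (and what docker stats reports).
    console.info(`iter ${i}: rss=${(rss / 1048576).toFixed(1)} MB, heapUsed=${(heapUsed / 1048576).toFixed(1)} MB`)
  }
}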

I have not seen others report this yet, so I figured I would report my findings. I hope it gets fixed sometime; until then, pinning the version to 0.1.41 works for me.

@7f8ddd

7f8ddd commented Feb 16, 2024

@Brooooooklyn If you could fix this, people would be very grateful. It's literally unusable unless you pre-generate images. There is 100% a leak and if you know where it is, that would be helpful for someone making a fix.

@7f8ddd

7f8ddd commented Feb 16, 2024

.create_buffer_with_borrowed_data(ptr.cast_mut(), size, 0, noop_finalize)

Is the use of noop_finalize here the cause?

@Brooooooklyn
Owner

I'm looking into this issue. What kind of image do you generate with @napi-rs/canvas? I'm trying to create a minimal repro.

@SteadEXE

SteadEXE commented Feb 19, 2024

In my case, a canvas of size 3000x925.

Methods and attributes used (maybe useful for reproduction code; see the sketch below):
fillStyle, fillRect, textAlign, setLineDash, strokeRect, save, restore, translate, font, textBaseline, beginPath, moveTo, lineTo, closePath, fill, stroke, lineWidth, strokeStyle

Then I call canvas.toBuffer('image/jpeg') to get my image.
Apparently this leads to the memory leak, but I haven't tested that myself.

I have 3 registered fonts.
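
A hedged reconstruction using those calls: the coordinates, colours, text, font path, and the fillText() call are placeholders; the canvas size, the listed methods, and the final toBuffer('image/jpeg') come from the report above.

import { createCanvas, GlobalFonts } from '@napi-rs/canvas'

// Placeholder registration; the real setup registers 3 fonts.
GlobalFonts.registerFromPath('./fonts/SomeFont.ttf', 'SomeFont')

function render() {
  const canvas = createCanvas(3000, 925)
  const ctx = canvas.getContext('2d')

  // Background and a dashed frame.
  ctx.fillStyle = '#ffffff'
  ctx.fillRect(0, 0, canvas.width, canvas.height)
  ctx.setLineDash([10, 5])
  ctx.lineWidth = 2
  ctx.strokeStyle = '#333333'
  ctx.strokeRect(20, 20, canvas.width - 40, canvas.height - 40)

  // Text drawn inside a translated save/restore block.
  ctx.save()
  ctx.translate(50, 50)
  ctx.font = '32px SomeFont'
  ctx.textAlign = 'left'
  ctx.textBaseline = 'top'
  ctx.fillStyle = '#000000'
  ctx.fillText('placeholder label', 0, 0)
  ctx.restore()

  // A simple filled-and-stroked path.
  ctx.beginPath()
  ctx.moveTo(100, 800)
  ctx.lineTo(400, 600)
  ctx.lineTo(700, 850)
  ctx.closePath()
  ctx.fill()
  ctx.stroke()

  // The call reported to trigger the growth.
  return canvas.toBuffer('image/jpeg')
}

for (let i = 0; i < 1000; i++) {
  render()
}
console.info(process.memoryUsage().rss / 1024 / 1024 + ' MB RSS')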

@Brooooooklyn
Owner

@SteadEXE thank you, I'll try

@Brooooooklyn
Owner

Has anyone actually hit a real-world OOM? I'm wondering if the RSS is allocated memory rather than actually used memory.

@SteadEXE

SteadEXE commented Feb 19, 2024

In my case, the process eats all the available RAM on Windows until the system itself runs out of memory.
Even after an idle period, the RAM never got released; the node process had to crash or be restarted.

@SteadEXE

After some testing, I can confirm that after generating 1000+ images on my side I reach 4.5 GB+ of RAM, while I never go over 300 MB on 0.1.37.

@7f8ddd

7f8ddd commented Feb 19, 2024

> Has anyone actually hit a real-world OOM? I'm wondering if the RSS is allocated memory rather than actually used memory.

The issue is an actual memory leak, as Bun.gc(true) would clear e.g. large arrays or other GC-managed objects.
The size of the leak goes up with image quality, so it's likely related to the encoded buffer itself not being destroyed properly.

Simply generating a few captchas on demand is basically impossible, which is how I found it.

import { createCanvas } from "@napi-rs/canvas";

async function test() {
  const canvas = createCanvas(600, 600);

  const ctx = canvas.getContext("2d");
  ctx.fillStyle = "black";
  ctx.fillRect(0, 0, canvas.width, canvas.height);

  // The memory leak is much worse when the quality is high. It only leaks when this is called.
  await canvas.encode("webp", 100);
}

for (let i = 0; i < 5000; i++) {
  await test();
}

setInterval(() => {
  Bun.gc(true);
  console.log(process.memoryUsage().rss / 1024 / 1024);
}, 1000);

@Brooooooklyn
Owner

Can you try this version to see if it has fixed your issue? https://github.com/Brooooooklyn/canvas/releases/tag/v0.1.48

@7f8ddd

7f8ddd commented Feb 21, 2024

> Can you try this version to see if it has fixed your issue? https://github.com/Brooooooklyn/canvas/releases/tag/v0.1.48

Edit: it's only leaking when you encode with png; webp was fixed. Not sure about the other formats.

For me it seems like it's still leaking the same amount on 0.1.48; in any case, it's still leaking regardless.
48k calls to the snippet below = 7474 MB.

const canvas = createCanvas(200, 200);
const ctx = canvas.getContext("2d");

ctx.fillStyle = "black";
ctx.fillRect(0, 0, canvas.width, canvas.height);

const encoded = await canvas.encode("png");
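
For comparison, a hedged sketch that runs the same fill-and-encode loop for both formats; the iteration count is arbitrary, and the comments reflect what is reported in this thread rather than something verified here:

import { createCanvas } from '@napi-rs/canvas'

async function run(format, quality) {
  for (let i = 0; i < 10000; i++) {
    const canvas = createCanvas(200, 200)
    const ctx = canvas.getContext('2d')
    ctx.fillStyle = 'black'
    ctx.fillRect(0, 0, canvas.width, canvas.height)
    // Encode and discard the buffer.
    await canvas.encode(format, quality)
  }
  console.info(`${format}: ${(process.memoryUsage().rss / 1048576).toFixed(1)} MB RSS`)
}

await run('webp', 100) // reported fixed on 0.1.48
await run('png')       // reported still leaking on 0.1.48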

@Brooooooklyn
Owner

Brooooooklyn commented Feb 22, 2024

@7te3ep I can't reproduce the leak with the code:

import { createCanvas } from './index.js'

async function test() {
  const canvas = createCanvas(2560, 1920)
  const ctx = canvas.getContext('2d')
  ctx.fillStyle = "black";
  ctx.fillRect(0, 0, canvas.width, canvas.height);

  await canvas.encode("webp", 100);
}

for (let i = 0; i < 5000; i++) {
  await test();
  global?.gc?.()
  console.info(process.memoryUsage().rss / 1024 / 1024 + " MB");
}

setInterval(() => {
  console.info(process.memoryUsage().rss / 1024 / 1024 + " MB");
}, 1000);

On my MacBook, the output was like:

285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB

As you can see, the memory usage is very stable

@7f8ddd

7f8ddd commented Feb 22, 2024

(quoting Brooooooklyn's comment above)

I'm not sure if you were replying to me, but it's leaking with png, not webp. webp is fine now.

Brooooooklyn reopened this Feb 22, 2024
@Brooooooklyn
Owner

Confirmed that PNG is still leaking; I will fix it immediately.

@SteadEXE

For me it looks like the issue is fixed, thank you 🥇
