next/image memory leak? #20915

Closed
Timer opened this issue Jan 8, 2021 · 20 comments
Labels: please add a complete reproduction (The issue lacks information for further investigation)
Milestone: iteration 16

Comments

@Timer (Member) commented Jan 8, 2021

Hi,

After the first release of our site in production with Next@v10 (v10.0.2 precisely), we noticed gradually increasing memory consumption on our servers.

Edit: worth noting that we've skipped 9.5 and upgraded straight from 9.4. So it could be a 9.5 issue.

Edit: maybe not related to the Image component, as we stopped using it and still notice a gradual increase in memory consumption (which didn't happen before v10). See this comment.

Edit (Jan 8th, 2021): this is definitely related to the Image component. We gave it a second chance, but had to roll back due to very high memory usage.

The only updated library in these releases was Next, and we do use the new Image component in a very limited set of images:

  • I estimate that no more than 50 images are being optimised with the Image component.
  • All of them are above the fold (hero images, with a max-age of 1 hour set in the Cache-Control header at the original source).
  • Our custom config in next.config.js (a usage sketch follows the list):

```js
images: {
  domains: ['storage.googleapis.com'],
  deviceSizes: [500, 575, 750, 1150, 1500, 1920],
},
```
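
For context, a minimal sketch of how one of these hero images might be rendered with the component; the bucket path and dimensions are hypothetical:

```jsx
import Image from 'next/image';

// Hypothetical hero image. With the config above, next/image requests
// /_next/image?... variants resized to the configured deviceSizes
// (500–1920px), sourced from the whitelisted storage.googleapis.com domain.
export default function Hero() {
  return (
    <Image
      src="https://storage.googleapis.com/my-bucket/hero.jpg" // hypothetical path
      alt="Hero"
      width={1920}
      height={800}
      priority // above the fold, so skip lazy loading
    />
  );
}
```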

Has anyone experienced this as well? Could it be related to the Image component? If so, it's worrying, as we're using it for only a very limited set of images and have plans to adopt it for a much larger set (e.g. product images).

As you can see in the image below, up until Next v10 the memory consumption was pretty steady. Let me know if we can provide some more details.

[Graph: memory consumption across Next 10 releases]

Originally posted by @josebrito in #19597

@Timer added the kind: bug and please add a complete reproduction (The issue lacks information for further investigation) labels on Jan 8, 2021
@Timer added this to the iteration 16 milestone on Jan 8, 2021
@timneutkens (Member)

My initial suspicion is that it's not a memory leak but just an increase in the Node.js fs cache with a lot of images (as they're cached on disk).

@josebrito could you try temporarily disabling the disk cache locally (just commenting it out), then request one image many times and see whether the memory is garbage collected or not?
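
A minimal sketch of that experiment, assuming a locally running `next start` on port 3000 (the image URL and width are hypothetical; substitute one of your own optimized images):

```js
// Request the same optimized image many times, then watch the Next.js
// server's RSS (e.g. with `top`) to see whether memory is reclaimed.
const http = require('http');

// Hypothetical URL: one hero image at one of the configured deviceSizes.
const url = 'http://localhost:3000/_next/image?url=%2Fhero.jpg&w=1500&q=75';

function hit(remaining) {
  if (remaining === 0) return console.log('done');
  http.get(url, (res) => {
    res.resume(); // drain the response body so the socket can be reused
    res.on('end', () => hit(remaining - 1));
  });
}

hit(1000);
```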

@josebrito

Hi @timneutkens!

I need a little help there: could you be more specific about what needs to be commented out? If you can point me to a resource, I'd appreciate it.

> My initial suspicion is that it's not a memory leak but just an increase in the Node.js fs cache with a lot of images (as they're cached on disk).

Exactly, but in our case it's a very small set of images, so we'd expect the memory usage to stabilise at some point. That's not the case: it seems to grow indefinitely.

Also, the image provider sets cache-control: public, max-age=3600, so if I understand your docs correctly, we'd expect existing images to be regenerated every hour.

@CRAKZOR commented Jan 11, 2021

You're not the only one; see #19597.

@mattalco commented Jan 12, 2021

I have the same issue with the new Image component. I just pushed an update this evening to my app on DigitalOcean's App Platform and immediately started seeing high memory usage. I waited about 90 minutes to see if it would settle down, but it never did. Once I rolled back to the standard img tag, everything returned to normal. If there are additional logs or info I can provide, please let me know. I'll be happy to provide whatever helps.

[Screenshot: memory usage, Jan 11, 2021 at 10:06 PM]

@lukas-bryla

> Edit: worth noting that we've skipped 9.5 and upgraded straight from 9.4. So it could be a 9.5 issue.

I believe this is not a 9.5 issue, since I upgraded from "next": "9.5.3" to "next": "^10.0.5" and did not have any issues before.

@mattalco

hey @Timer, if you're still looking for a reproduction of this, I'd be happy to grant you access to my private project that manifests this issue.

@jondcallahan

I am seeing this too, even with just one next/image component. Below is a before and after of switching from next/image to a picture element wrapping some source elements and an img tag.

[Screenshot: before/after memory usage]

@OmgDef commented Jan 26, 2021

Switching to libjemalloc helps. We had a huge memory leak on Ubuntu 20.04. More info here: lovell/sharp#1803

@bigxd123

> Switching to libjemalloc helps. We had a huge memory leak on Ubuntu 20.04. More info here: lovell/sharp#1803

How did you get libjemalloc1 on 20.04? All I can find is libjemalloc2, which eventually crashes in production.

@OmgDef commented Feb 15, 2021

> How did you get libjemalloc1 on 20.04? All I can find is libjemalloc2, which eventually crashes in production.

I installed libjemalloc2 and created a pm2 config to set the environment variables:

```js
module.exports = {
  apps: [
    {
      name: 'site',
      script: 'yarn',
      args: 'start',
      watch: true,
      env: { LD_PRELOAD: '/usr/lib/x86_64-linux-gnu/libjemalloc.so.2', NODE_ENV: 'production' },
      env_production: { LD_PRELOAD: '/usr/lib/x86_64-linux-gnu/libjemalloc.so.2', NODE_ENV: 'production' },
    },
  ],
};
```
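
As a quick sanity check that the preload actually took effect, a minimal sketch (Linux-only) that inspects the running Node.js process's own memory maps:

```js
// Run this under pm2 (or any shell with LD_PRELOAD set) to confirm
// jemalloc is mapped into the process.
const { readFileSync } = require('fs');

const maps = readFileSync('/proc/self/maps', 'utf8');
console.log(maps.includes('libjemalloc')
  ? 'jemalloc is loaded'
  : 'jemalloc is NOT loaded');
```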

@bigxd123

> I installed libjemalloc2 and created a pm2 config to set the environment variables: […]

Okay, I see. We've been running libjemalloc2 in production as well now, and so far no crashes. The earlier crashes must have been related to another issue that has since been fixed.

@timneutkens (Member)

Please try running next@canary: we've rewritten the image optimization to no longer use sharp, relying instead on the WebAssembly binaries included in squoosh.app. PR: #22253

@orbiteleven

Still seeing this on 10.0.8: modifying a ~7 MB file makes the memory jump to 1.2 GB.

[Screenshot: memory usage, Mar 7, 2021]

@j-mendez (Contributor) commented Mar 7, 2021

@orbiteleven use next@canary for now, or remove the Image component until the next release of Next.js. There's an issue with the image optimization.

@orbiteleven

@j-mendez Per the changelog, #22253 is in v10.0.8. v10.0.9-canary.0 doesn't seem to be related, but maybe I'm missing something?

@bravetheheat

Happening for us too. It got to the point where it was making AWS t3.medium instances unresponsive (4 GB RAM with 2 GB swap).
Downgrading to 10.0.7 made things much better.

@jnv commented Mar 12, 2021

Regarding the high memory usage in 10.0.8, there's a new bug report in #22925.

@BrunoHiis

Happened in my project as well; I removed all next/image components for now as a quick fix.

[Screenshot: memory usage graph]

@lovell (Contributor) commented Mar 12, 2021

Hi, I'm the sharp maintainer and I've just been made aware that Next.js switched away from using it, due in part to this perceived memory leak.

The reports here look like the effects of memory fragmentation within the stock glibc-based Linux allocator and its inability to return freed memory to the OS. That's why people who are using the jemalloc memory allocator or the musl-based Alpine are unaffected. There's a lot of background and discussion about this at lovell/sharp#955.

For those still using glibc-based Linux and older versions of Next.js that depend upon sharp, concurrency and therefore the likelihood of fragmentation can be manually controlled via e.g. sharp.concurrency(1) - see https://sharp.pixelplumbing.com/api-utility#concurrency
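
For illustration, a minimal sketch of capping concurrency at startup, using the documented sharp API (where exactly you call this in a Next.js server is up to you):

```js
// Cap libvips' thread pool at 1 to reduce allocator fragmentation on
// glibc-based Linux, per the sharp concurrency docs linked above.
const sharp = require('sharp');

sharp.concurrency(1);             // setter: process images on a single thread
console.log(sharp.concurrency()); // getter: prints 1
```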

The forthcoming sharp v0.28.0 will detect which memory allocator is being used at runtime and limit concurrency if required - see lovell/sharp#2607 - hopefully this will allow people to make a more informed choice about the most appropriate memory allocator for their scenario.

I would expect the new Wasm-based approach to exhibit a much higher peak memory requirement as entire decompressed images will be held in memory, possibly for longer periods than previously due to slower encoding performance. I notice there are comments in #22925 which would appear to confirm this.

As always, please do feel free to ask for help at the https://github.com/lovell/sharp repo if you're unsure about the best way in which to use sharp for a given scenario. If you hadn't seen it, there's a meta-issue for the next release at lovell/sharp#2604

@balazsorban44 (Member)

This issue has been automatically locked due to no recent activity. If you are running into a similar issue, please create a new issue with the steps to reproduce. Thank you.

@vercel locked as resolved and limited conversation to collaborators on Jan 28, 2022