
JavaScript heap out of memory on version 6.2.3 #11641

Closed
dcharbonnier opened this issue Apr 7, 2022 · 4 comments
The cursor is leaking memory. I will try the workarounds described in #9864.

for await (const fileStorageDocument of ManagedFileStorageModel.find({
      mode: StorageMode.create,
      createdAt: { $lte: new Date(Date.now() - 1 * HOUR) }
    })
      .populate(['file', 'storage'])
      .cursor()) {
  console.log(`${fileStorageDocument.file.id}`);
}
<--- Last few GCs --->

[59:0x7fc8ac496340]   964351 ms: Scavenge 2018.0 (2083.2) -> 2012.3 (2083.2) MB, 6.4 / 0.0 ms  (average mu = 0.629, current mu = 0.314) task 
[59:0x7fc8ac496340]   964488 ms: Scavenge 2018.8 (2083.2) -> 2013.0 (2083.2) MB, 6.8 / 0.1 ms  (average mu = 0.629, current mu = 0.314) task 
[59:0x7fc8ac496340]   964661 ms: Scavenge 2019.2 (2083.4) -> 2013.8 (2099.4) MB, 8.4 / 0.0 ms  (average mu = 0.629, current mu = 0.314) task 


<--- JS stacktrace --->

FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
@vkarpov15 vkarpov15 added this to the 6.2.14 milestone Apr 14, 2022
Uzlopak (Collaborator) commented Apr 20, 2022

If you provide an MVCE (minimal, complete, verifiable example), I could have a look at it.

Uzlopak (Collaborator) commented Apr 21, 2022

@dcharbonnier

Hi, what happened? Did you solve it?

@vkarpov15 vkarpov15 reopened this Apr 21, 2022
vkarpov15 (Collaborator) commented May 6, 2022

I was able to reproduce this. I ran the script below with node --max-old-space-size=128 to limit max heap usage to 128MB. The script ran out of memory after about 191k documents. So there may very well be a leak, but a very slow one.

Getting to the bottom of this will be tricky, because this script makes my laptop overheat :)

'use strict';

const mongoose = require('mongoose');

run().catch(err => console.log(err));

async function run() {
  await mongoose.connect('mongodb://localhost:27017/test');

  const File = mongoose.model('File', mongoose.Schema({ name: String }));
  const Storage = mongoose.model('Storage', mongoose.Schema({ data: String }));
  const ManagedFileStorageModel = mongoose.model('ManagedFileStorage', mongoose.Schema({
    file: { type: 'ObjectId', ref: 'File' },
    storage: { type: 'ObjectId', ref: 'Storage' }
  }));

  const count = await ManagedFileStorageModel.countDocuments();
  if (count < 500000) {
    for (let i = 0; i < 500000 - count; ++i) {
      if (i % 1000 === 0) console.log(i);
      const file = await File.create({ name: 'test' });
      const storage = await Storage.create({ data: 'foobar' });
      const doc = await ManagedFileStorageModel.create({
        file: file._id,
        storage: storage._id
      });
    }
  }

  console.log('Executing cursor');
  let docs = 0;
  for await (const fileStorageDocument of ManagedFileStorageModel.find().populate(['file', 'storage']).cursor()) {
    if (++docs % 1000 === 0) console.log(docs);
  }

  console.log('Done');
}
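To watch the heap grow while a repro like this runs, one can log heap usage every N documents inside the cursor loop. A minimal generic Node.js sketch (not from the thread; the name logHeap is illustrative):

```javascript
// Minimal heap-usage logger (generic Node.js, illustrative name).
// Call it every N documents inside the for-await loop to watch
// heapUsed climb as the leak accumulates.
function logHeap(label) {
  const heapUsedMb = process.memoryUsage().heapUsed / 1024 / 1024;
  console.log(`${label}: heapUsed=${heapUsedMb.toFixed(1)} MB`);
  return heapUsedMb;
}

// Example: logHeap(`after ${docs} docs`) next to the docs counter above.
```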

A workaround would be to use .lean(). Or avoid the cursor entirely and instead load documents in small batches (e.g. 10 at a time) using find().
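The batch-loading workaround could be sketched as follows. This is a hypothetical helper, not code from the thread: findBatch stands in for a query like Model.find().skip(skip).limit(limit).populate([...]).lean().exec().

```javascript
// Hypothetical batch-loading helper: page through results with
// skip/limit instead of holding one long-lived populated cursor,
// so each batch can be garbage-collected between iterations.
async function processInBatches(findBatch, handle, batchSize = 10) {
  let processed = 0;
  while (true) {
    // findBatch(skip, limit) is assumed to resolve to an array of docs.
    const batch = await findBatch(processed, batchSize);
    if (batch.length === 0) break;
    for (const doc of batch) {
      await handle(doc);
    }
    processed += batch.length;
  }
  return processed; // total number of documents handled
}
```

With mongoose, findBatch would be wired up as something like (skip, limit) => ManagedFileStorageModel.find().skip(skip).limit(limit).populate(['file', 'storage']).lean().exec().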

vkarpov15 (Collaborator) commented
Confirmed a very slow memory leak when using .cursor() with populate(), and fixed it. The fix will be in v6.3.3 👍
