
Memory leak #94

Closed
dottodot opened this issue Sep 30, 2018 · 13 comments · Fixed by #135

@dottodot

I'm pretty sure I have a really big memory leak.

I'm using memwatch-next and it regularly reports heap growth such as:

Memory leak detected:
 { growth: 512,
  reason: 'heap growth over 5 consecutive GCs (12s) - 409.2 mb/hr' }

This is resulting in the app crashing fairly regularly.

Using Chrome inspect, I've taken some snapshots while doing several uploads, and one thing that keeps increasing is the strings. In the attached screenshots, the thing that looks strange is that when you view the details of a string, there seems to be a never-ending nested "event created".

screen shot 2018-09-30 at 11 56 03

screen shot 2018-09-30 at 11 56 15

One thing I believe could be contributing is the use of debug, which is logging every event to memory, so if you have a lot of events it's going to cause issues as far as I can tell. debug doesn't serve any purpose in production, so there should be an option to turn it off, or preferably have it turned off by default with the option to turn it on if needed.

Happy to be proved wrong, but the app seems to have consistent memory usage when sync isn't running.

@dottodot
Author

OK, so I don't think it is debug, but having sync enabled is the only thing that causes an increase in strings in memory. I guess it's something to do with those strange endless nested strings, but I can't work out where they're coming from.

@daffl
Member

daffl commented Sep 30, 2018

Have you tried whether this is also happening with other queueing mechanisms (Redis or RabbitMQ)?

@dottodot
Author

dottodot commented Sep 30, 2018

Yes, just tried it and it crashed the server in about a minute.

@daffl
Member

daffl commented Oct 1, 2018

I tried this with the test application setup by sending 10 events per second with a 7Kb Base64 encoded payload and was not able to confirm a memory leak in feathers-sync. Here are the memory snapshots after one, two and three minutes, which show constant memory use:

screen shot 2018-10-01 at 1 51 53 pm

By default, the debug module is not enabled unless the DEBUG environment variable is set, and the debug calls in feathers-sync only log basic information (no data). If you still believe this is a Feathers problem, I'll need a minimal example that reproduces it.
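To illustrate why an unset DEBUG variable should make the logging essentially free, here is a simplified sketch of the gating the debug npm package applies (the real package also supports wildcard and exclusion patterns; `makeDebug`/`debugEnabled` are stand-in names, not the package's API):

```javascript
// Sketch: a namespaced logger is a no-op unless process.env.DEBUG matches it.
function debugEnabled(namespace) {
  const patterns = (process.env.DEBUG || '').split(',').map(s => s.trim());
  return patterns.includes(namespace) || patterns.includes('*');
}

function makeDebug(namespace) {
  return debugEnabled(namespace)
    ? (...args) => console.error(namespace, ...args)
    : () => {}; // disabled: nothing is formatted, nothing is retained
}

const log = makeDebug('feathers-sync');
log('only printed when DEBUG=feathers-sync (or DEBUG=*) is set');
```

When the namespace is disabled, the logger never builds log strings, so it should not contribute to heap growth.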

@dottodot
Author

dottodot commented Oct 2, 2018

I'm wondering if I'm maybe seeing a false positive on the memory leak, but my main issue is that my app keeps crashing when sync is on, so I was assuming it was something to do with the leaks I was seeing.

Here are the warnings I got over the past 12 hours; the heap size does keep going back down to a normal level, though:

Memory leak detected:
{ growth: 4526280,
  reason: 'heap growth over 5 consecutive GCs (2m 2s) - 127.37 mb/hr' }
Program is using 101738768 bytes of Heap.
Memory leak detected:
{ growth: 3776520,
  reason: 'heap growth over 5 consecutive GCs (50s) - 259.31 mb/hr' }
Program is using 96607120 bytes of Heap.
(node:35) DeprecationWarning: collection.remove is deprecated. Use deleteOne, deleteMany, or bulkWrite instead.
Memory leak detected:
{ growth: 4185936,
  reason: 'heap growth over 5 consecutive GCs (47s) - 305.77 mb/hr' }
Program is using 129948336 bytes of Heap.
Memory leak detected:
{ growth: 5362256,
  reason: 'heap growth over 5 consecutive GCs (1m 16s) - 242.23 mb/hr' }
Program is using 96976896 bytes of Heap.
Memory leak detected:
{ growth: 3808976,
  reason: 'heap growth over 5 consecutive GCs (45s) - 290.6 mb/hr' }
Program is using 95586000 bytes of Heap.
Memory leak detected:
{ growth: 3352624,
  reason: 'heap growth over 5 consecutive GCs (1m 4s) - 179.85 mb/hr' }
Program is using 96810920 bytes of Heap.
Memory leak detected:
{ growth: 4881736,
  reason: 'heap growth over 5 consecutive GCs (46s) - 364.35 mb/hr' }
Program is using 101280064 bytes of Heap.
Memory leak detected:
{ growth: 5007872,
  reason: 'heap growth over 5 consecutive GCs (3m 6s) - 92.44 mb/hr' }
Program is using 101886016 bytes of Heap.

@daffl
Member

daffl commented Oct 2, 2018

I'm not saying there isn't an issue, but unfortunately there isn't a lot to go on to help out.

How many event listeners are there in your application? Do all events get delivered? The event system wasn't really intended to send very large (Base64 encoded) payloads, so there might be some limitations. When using Redis instead, what crashed? The connection? Memory? An infinite loop?

@dottodot
Author

dottodot commented Oct 2, 2018

Event listeners could be the issue as there are quite a few services; can they be turned off per service? Unless I've done something wrong, I'm only delivering them in the following circumstances:

  app.service('order').publish((data, context) => {
    return app.channel('admins');
  });

  app.service('order-comment').publish((data, context) => {
    return app.channel('admins');
  }); 

I need to look into the redis issue further.
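On the question of turning events off per service: as far as I can tell, a Feathers publisher that returns no channel dispatches the event to no connections. A minimal sketch of that idea, with a hypothetical toggle helper (not a Feathers API) standing in for the publish/channel wiring above:

```javascript
// Hypothetical helper: build a Feathers-style publisher that can be
// switched off. A publisher returning a falsy value sends the event
// to no connections, effectively disabling it for that service.
function makePublisher(enabled, channel) {
  return (data, context) => (enabled ? channel : null);
}

const adminChannel = { name: 'admins' }; // stand-in for app.channel('admins')
const orderPublisher = makePublisher(true, adminChannel);  // delivered to admins
const auditPublisher = makePublisher(false, adminChannel); // delivered nowhere
```

Note the caveat that publishers only control websocket dispatch; feathers-sync may still propagate the event between instances regardless, so this alone may not reduce cross-instance traffic.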

@dottodot
Author

dottodot commented Oct 6, 2018

OK, found a perfect example that shows a definite increase in memory use when sync is enabled:

uploading multiple files using feathers-blob to S3.

The first screenshot is without sync, the second with. This was uploading 40 files in succession.

screen shot 2018-10-06 at 09 09 16
screen shot 2018-10-06 at 09 07 46

It's most likely due to what you said about the event containing the Base64 payload, and I'm assuming it relates to these two issues:
feathersjs/feathers#922
feathersjs/feathers#862

Adding require('@feathersjs/feathers').SKIP; to my upload create hook seems to resolve it.
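An alternative to the SKIP token is a hook that suppresses event emission for large results; Feathers allows skipping event dispatch for a call by setting context.event to null. A sketch under that assumption (the threshold, hook name, and service name are illustrative, not documented values):

```javascript
// Illustrative after-hook: suppress event dispatch when the result is
// large, so Base64 payloads are not broadcast through feathers-sync.
// MAX_EVENT_PAYLOAD is an arbitrary cutoff, not a documented limit.
const MAX_EVENT_PAYLOAD = 64 * 1024; // bytes

function skipLargeEvents(context) {
  const size = Buffer.byteLength(JSON.stringify(context.result || {}));
  if (size > MAX_EVENT_PAYLOAD) {
    context.event = null; // no 'created'/'updated' event is emitted for this call
  }
  return context;
}

// Hypothetical wiring:
// app.service('uploads').hooks({ after: { create: [skipLargeEvents] } });
```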

@dottodot
Author

dottodot commented Oct 7, 2018

Actually, I don't think that's helped much, as I'm still seeing high heap growth figures. I've got it to work with Redis now, so I'll see if that helps at all. It usually doesn't take long for it all to come crashing down, so I'll know pretty soon.

@EliSadaka

Our app is also experiencing memory leak issues, and I believe the source is feathers-sync, since we were not experiencing them before we started using it. However, our deployment that runs on a single instance with multiple cores isn't experiencing any memory issues. It only seems to be our multi-instance cluster that has this problem, where all instances show steady memory leaks that eventually lead to crashes. Both deployments are essentially the same app with very minor differences, so I don't think the issue lies in our code.

@daffl daffl added the bug label Dec 17, 2018
@daffl
Member

daffl commented Dec 17, 2018

There definitely could be an issue; the problem is that it'll be hard to track down. The first thing I'd recommend is to try whether it also happens with another transport. If it happens there too, it's a problem with this module, but I have a suspicion this has something to do with the MongoDB mubsub plugin.
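For anyone wanting to rule the MongoDB adapter in or out, feathers-sync picks its transport from the connection URI, so switching is a one-line config change. A sketch based on the feathers-sync README (the URIs are placeholders):

```javascript
const feathers = require('@feathersjs/feathers');
const sync = require('feathers-sync');

const app = feathers();

// MongoDB/mubsub transport (the suspected culprit):
// app.configure(sync({ uri: 'mongodb://localhost:27017/sync' }));

// Redis transport, for comparison:
app.configure(sync({ uri: 'redis://localhost:6379' }));
```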

@christo-ph

Coming from #92 because @dottodot linked it there.

While my threads die with MongoDB and feathers-sync as described in #92, it worked with Redis without any increase in memory footprint.

@daffl
Member

daffl commented Aug 15, 2019

It's definitely the Mubsub package, which just seems broken overall.
