
Potential Memory Leak #335

Closed
kyork-cl opened this issue Mar 29, 2017 · 11 comments

@kyork-cl
Contributor

I'm investigating the source of a system crash in our product, and I think I've narrowed it down to the "argo-resource" dependency. If you install a resource:request:before handler, the program's memory consumption grows until it exhausts system memory and crashes. If you install a request handler instead, the leak does not occur.

    var zetta = require('zetta');
    
    zetta()
      // .use(handleWithLeak)
      .use(handleWithoutLeak)
      .listen(1337, function zettaReady() {
        console.log('Running...');
      });
    
    function handleWithLeak(server) {
      server.httpServer.cloud.use(function setupResourceHandler(handle) {
        handle('resource:request:before', function resourceHandler(env, next) {
          next(env);
        });
      });
    }
    
    function handleWithoutLeak(server) {
      server.httpServer.cloud.use(function setupRequestHandler(handle) {
        handle('request', function requestHandler(env, next) {
          next(env);
        });
      });
    }

I'll continue to dig into the "argo-resource" dependency and see if I can locate the source of the leak.

@kyork-cl
Contributor Author

The leak is fairly slow, so I'm using Vegeta to expedite the process:

    echo "GET http://127.0.0.1:1337/" | vegeta attack | vegeta report

@wooliet
Contributor

wooliet commented Mar 29, 2017

@AdamMagaluk argo-resource/resources.js method _setupRequest checks for two special pipelines, "resource:request:before" and "resource:request:after". In both cases, if found, the siphon method of the pipeline is called.

var pre = context.env.pipeline('resource:request:before');
if (pre) {
  pre.siphon(context.env, function(env) {
    // ... request processing continues here ...
  });
}

This goes into pipeworks/pipeworks.js method siphon, which in turn calls fit.

If I update fit with a check to see if the pipeline is already built, the memory issues go away.

Pipeworks.prototype.fit = function(options, pipe) {
  if (this.state === 'built') {
    return this; // skip re-building an already-built pipeline
  }
  // ... original fit logic continues here ...
};

This is just a guess on my part as far as the logic goes. The build process creates LinkedList.Node instances, which appear to be a major contributor to the memory growth. My assumption is that once a pipeline has been through its build process, it does not need to go through it again.

@mdobson
Contributor

mdobson commented Mar 29, 2017

Interesting. Rebuilding that pipe every time would definitely make the memory expand.

@wooliet
Contributor

wooliet commented Mar 29, 2017

Would it be possible to update argo/builder.js to handle the "resource:" pipes, and have the build method take care of the siphon creation and such?

@mdobson
Contributor

mdobson commented Mar 29, 2017

Possibly. I don't have cycles to do it though.

@wooliet
Contributor

wooliet commented Mar 30, 2017

Good news: this is not an issue with the latest version of pipeworks (1.3.1), and it has actually been fixed since 1.3.0 (released March 2015).

Here is the commit:
kevinswiber/pipeworks@3f7d02e

@mdobson or @AdamMagaluk : can you bump the version in all relevant zetta package files?
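A bump along those lines would look like this in each affected package.json (a sketch; the exact range specifier is a choice, and any range that resolves to 1.3.0 or later picks up the fix):

```json
{
  "dependencies": {
    "pipeworks": "^1.3.1"
  }
}
```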

@AdamMagaluk
Collaborator

Sorry for the delay, I'll get this updated and a zetta release done by the end of the weekend. Thanks for the debugging effort.

@AdamMagaluk
Collaborator

@wooliet Is it safe to close this now?

@wooliet
Contributor

wooliet commented Apr 10, 2017

I'll leave that to @kyork-cl, since he opened the issue.

@kyork-cl
Contributor Author

Yes, the issue looks to be resolved now. Thanks.
