Memory leak #197

Closed
UnbrandedTech opened this issue Jul 24, 2018 · 11 comments
Labels: bug (Something isn't working)

Comments

UnbrandedTech commented Jul 24, 2018

What's the ETA on a KoaJS integration?
We're trying to manually wire up dd-trace but it's causing memory leaks.

The plateaus in the image are from when we reverted the commit, after four attempts to figure out what was causing the ENOMEM errors.

[screenshot: memory usage graph over time]

@UnbrandedTech (Author)

Here's our code that caused the memory leak.

DataDog.js

'use strict'

const { Tags, FORMAT_HTTP_HEADERS } = require('opentracing')

const tracer = require('dd-trace').init({
    service: 'API',
    plugins: false,
    env: process.env.NODE_ENV,
})

tracer.use('mysql2')
tracer.use('redis')

module.exports.tracer = tracer
module.exports.Tags = Tags
module.exports.FORMAT_HTTP_HEADERS = FORMAT_HTTP_HEADERS

DataDogMiddleware.js

'use strict'

const { tracer, Tags, FORMAT_HTTP_HEADERS } = require('../lib/DataDog')

//an array of routes to ignore i.e. '/status' or '/health'
const ignored_routes = require('../lib/constants/ignoredRoutes')

//an array of acceptable methods i.e. 'GET' or 'POST'
const whitelisted_methods = require('../lib/constants/whitelistedMethods')

module.exports = async (ctx, next) => {
    if (whitelisted_methods.includes(ctx.method) && !ignored_routes.includes(ctx.path)) {
        const span = tracer.startSpan('koajs.request', {
            childOf: tracer.extract(FORMAT_HTTP_HEADERS, ctx.request.headers),
            tags: {
                [Tags.SPAN_KIND]: Tags.SPAN_KIND_RPC_SERVER,
                [Tags.HTTP_URL]: ctx.request.href,
                [Tags.HTTP_METHOD]: ctx.request.method
            }
        })

        const scope = tracer.scopeManager().activate(span)

        await next()

        span.setTag('resource.name', ctx.request.path)

        span.setTag('service.name', 'API')
        span.setTag('span.type', 'web')
        // Koa exposes the response status as ctx.status; flag 5xx responses as errors
        span.setTag(Tags.HTTP_STATUS_CODE, ctx.status)

        if (ctx.status >= 500) span.setTag(Tags.ERROR, true)

        span.finish()
        scope.close()
    } else {
        await next()
    }
}
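
One thing worth noting about the middleware above: if next() throws, span.finish() and scope.close() are never reached, so every failed request leaves an open span and scope behind. That turned out not to be the cause of the leak reported here (see the fix later in the thread), but an error-safe variant is cheap. A minimal sketch, assuming the same tracer, Tags, and FORMAT_HTTP_HEADERS exports from DataDog.js and the dd-trace 0.x scopeManager() API, with the method/route filtering omitted for brevity:

'use strict'

const { tracer, Tags, FORMAT_HTTP_HEADERS } = require('../lib/DataDog')

module.exports = async (ctx, next) => {
    const span = tracer.startSpan('koajs.request', {
        childOf: tracer.extract(FORMAT_HTTP_HEADERS, ctx.request.headers),
        tags: {
            [Tags.SPAN_KIND]: Tags.SPAN_KIND_RPC_SERVER,
            [Tags.HTTP_URL]: ctx.request.href,
            [Tags.HTTP_METHOD]: ctx.method
        }
    })

    const scope = tracer.scopeManager().activate(span)

    try {
        await next()
    } catch (err) {
        // mark the span as errored, then let Koa's own error handling proceed
        span.setTag(Tags.ERROR, true)
        throw err
    } finally {
        // always finish the span and close the scope, even when next() throws
        span.setTag('resource.name', ctx.request.path)
        span.setTag(Tags.HTTP_STATUS_CODE, ctx.status)
        span.finish()
        scope.close()
    }
}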

SethDavenport commented Jul 25, 2018

Seeing something very similar in my codebase.

I'm using dd-trace-js@0.4.0 and the Heroku buildpack. Here's a screenshot of Heroku's dyno metrics:

[screenshot: Heroku dyno memory metrics]

The point where the memory starts to climb coincides with the deployment of the agent/buildpack.

In terms of code, I'm doing almost exactly the same thing as @UnbrandedTech (looks like we both copied and adapted the express middleware for use with Koa 😂):

// @flow
import { tracer } from 'initializers/datadog';
import type { Middleware } from 'koa';

// Customized and Koa-ified from the express middleware that ships with dd-trace:
// https://github.com/DataDog/dd-trace-js/blob/master/src/plugins/express.js
//
// See https://datadog.github.io/dd-trace-js/#integrations
// See also https://github.com/koajs/koa/blob/master/docs/api/context.md

const OPERATION_NAME = 'koa.request';

const validateStatus = code => code < 500;

export const datadog: Middleware = async (ctx, next) => {
  const { method, url } = ctx;

  const span = tracer.startSpan(OPERATION_NAME, {
    tags: {
      'http.url': url,
      'http.method': method,
    },
  });

  const scope = tracer.scopeManager().activate(span);

  await next();

  const resourceName = ctx._matchedRoute === '(.*)' ? ctx.url : ctx._matchedRoute;
  span.setTag('resource.name', `${method} ${resourceName}`);
  span.setTag('span.type', 'http');
  span.setTag('http.status', ctx.status);

  if (!validateStatus(ctx.status)) {
    span.setTag('error', true);
  }

  span.finish();
  scope.close();
};
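
For context, a minimal sketch of how a middleware like the one above would typically be registered so that its span wraps everything downstream; the app.js name and the './middleware/datadog' import path are hypothetical placeholders for wherever the snippet lives:

// app.js (hypothetical file and import paths)
import Koa from 'koa';
import { datadog } from './middleware/datadog';

const app = new Koa();

// register the tracing middleware first so the koa.request span wraps the whole request
app.use(datadog);

// ...routes and the rest of the middleware stack go here...

app.listen(process.env.PORT || 3000);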

@SethDavenport

Trying with 0.5.1. Will update once I have more data.

@UnbrandedTech (Author)

Very interested to hear if it works. It took our system down, so I'm not willing to test it until I get confirmation from the dev team that it's been fixed.

@palazzem added the bug label Jul 26, 2018
rochdev (Member) commented Jul 26, 2018

I was able to replicate this and I am currently investigating. I'll let you know as soon as a fix is on the way.

I'll move KoaJS support to a new issue since this discussion is really about the leak and it doesn't seem to be related to the middleware.

@rochdev changed the title from "KoaJS Plugin" to "Memory leak" Jul 26, 2018
This was referenced Jul 26, 2018
@SethDavenport

Yeah, 0.5.1 exhibits the same behaviour unfortunately. Thanks for looking into it @rochdev!

rochdev (Member) commented Jul 26, 2018

Can you try with 0.5.2-beta.0? It seems to fix it for me.

holdenmc commented Jul 26, 2018

We were having the same issue described here. We just re-deployed with 0.5.2-beta.0 about 15 minutes ago and it's holding steady memory-wise.

(Edit: it's been over an hour now and memory use is still at normal levels.)

rochdev (Member) commented Jul 27, 2018

Fixed by #203

@rochdev closed this as completed Jul 27, 2018
rochdev (Member) commented Jul 27, 2018

Released in 0.5.2

@SethDavenport

Just tried 0.5.2 - memory usage appears to be stable. Thanks for the quick turnaround!
