
Add import of v8 heap allocation profile #170

Merged
merged 6 commits into jlfwong:master on Oct 8, 2018

Conversation

3 participants
@vmarchaud
Contributor

vmarchaud commented Oct 4, 2018

First PR here, but first, thanks for the awesome tool!

We plan to integrate it into our product (https://pm2.io/) to visualize all kinds of profiles that we can take out of Node.js. For this we needed to be able to render heap allocations made by V8, so here it is!

Sorry for not opening an issue first; we just wanted to see if we were able to implement it before bothering you.

Regards

@coveralls

coveralls commented Oct 4, 2018

Coverage Status

Coverage increased (+0.6%) to 44.482% when pulling b0d4589 on vmarchaud:vmarchaud/import-v8-heap into d78d20e on jlfwong:master.

@jlfwong

Wooooaaaah cool! Thanks for doing this and nice work figuring out where to put the necessary code!

Before this can land, in addition to the code review comments that need resolving, I need a few things:

  1. Instructions for how to record a heap allocation profile like this to be placed into the PR description (I'll use that to update the wiki page: https://github.com/jlfwong/speedscope/wiki/Importing-from-Node.js)
  2. A minimal checked in example of a heap allocation profile placed in this directory within a subdirectory for the relevant node version https://github.com/jlfwong/speedscope/tree/master/sample/profiles/node
  3. Leveraging the checked in allocation profile in 2, a test which imports the profile and asserts a snapshot of the contents, like this: https://github.com/jlfwong/speedscope/blob/master/src/import/v8proflog.test.ts

I also don't use yarn for this project, so if you could please remove the yarn.lock file from this PR, that would be much appreciated!

@@ -103,6 +104,9 @@ async function _importProfileGroup(dataSource: ProfileDataSource): Promise<Profi
} else if (fileName.endsWith('.v8log.json')) {
console.log('Importing as --prof-process v8 log')
return toGroup(importFromV8ProfLog(JSON.parse(contents)))
} else if (fileName.endsWith('.heapprofile')) {

@jlfwong

jlfwong Oct 4, 2018

Owner

I try to make every kind of profile importable even if it has the incorrect extension, by using heuristics. Can you please add a sensible heuristic to the list below for the second pass?

Maybe something like:

'head' in parsed && 'callFrame' in parsed['head']


@vmarchaud

vmarchaud Oct 5, 2018

Contributor

I went for 'head' in parsed && 'selfSize' in parsed['head'] because it's more specific to the heap profile format

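For illustration, the shape-probing heuristic agreed on above can be sketched in isolation. This is a minimal sketch, not the PR's actual import code; the object literals below are stand-ins for real profile files:

```typescript
// Minimal sketch of the extension-independent second pass: after JSON.parse,
// probe the object's shape to guess the format. 'selfSize' distinguishes a
// V8 heap allocation profile from a CPU profile, whose nodes carry no sizes.
type ParsedJSON = Record<string, unknown>

function looksLikeHeapProfile(parsed: ParsedJSON): boolean {
  return (
    'head' in parsed &&
    typeof parsed['head'] === 'object' &&
    parsed['head'] !== null &&
    'selfSize' in (parsed['head'] as Record<string, unknown>)
  )
}

// A CPU-profile-shaped head node has a callFrame but no selfSize, so it is rejected.
const heap = {head: {callFrame: {functionName: '(root)'}, selfSize: 0, children: []}}
const cpu = {head: {callFrame: {functionName: '(root)'}, children: []}}
console.log(looksLikeHeapProfile(heap)) // true
console.log(looksLikeHeapProfile(cpu)) // false
```

This is why 'selfSize' is the more specific probe: both formats share a 'head' node, but only the heap allocation format records allocated bytes per node.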

Outdated diff: src/import/v8heapalloc.ts
@@ -0,0 +1,97 @@
import {Profile, FrameInfo, StackListProfileBuilder} from '../lib/profile'


@jlfwong

jlfwong Oct 4, 2018

Owner

Is there a URL you can add to a comment here that provides documentation on this file format? Or at least a link that explains how to record this kind of profile?


Outdated diff: src/import/v8heapalloc.ts
while (true) {
if (typeof currentNode.parent !== 'number') break
const parent = nodeById.get(currentNode.parent)
if (parent === undefined) break


@jlfwong

jlfwong Oct 4, 2018

Owner

In this case I would just do if (!parent) break, since any node returned by nodeById.get(...) will not be falsy. if (parent != null) would also be fine.



@vmarchaud

vmarchaud Oct 5, 2018

Contributor

Performance-wise, it's better to check against the type you need than to rely on V8 adding the check for 0/null/undefined for you.
Since Map.get returns undefined if no entry is found, I believe we can check only for undefined.

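The point about Map.get can be demonstrated in isolation. This is a toy sketch with a made-up node shape, not the PR's code:

```typescript
// Map.get returns undefined (never null) when the key is absent, so a strict
// === undefined check is sufficient and avoids coercing other falsy values.
const nodeById = new Map<number, {id: number; selfSize: number}>()
nodeById.set(1, {id: 1, selfSize: 64})

const hit = nodeById.get(1)
const miss = nodeById.get(2)

console.log(hit !== undefined)  // true
console.log(miss === undefined) // true
```

Note that if a Map could hold falsy values such as 0 or '' directly, a bare `if (!value)` check would conflate "absent" with "present but falsy", which is another argument for the explicit `=== undefined` comparison.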

const parent = nodeById.get(currentNode.parent)
if (parent === undefined) break
// push the parent at the beginning of the stack
stack.unshift(parent)


@jlfwong

jlfwong Oct 4, 2018

Owner

Weird! I would've expected doing repeated stack.push(parent) followed by a stack.reverse() at the end to be more efficient than doing repeated unshift calls, but it looks like that's wrong? https://jsperf.com/array-unshift-vs-array-push-and-array-reverse. I would've thought that arrays would be implemented as vectors, making unshift an O(n) operation, but it looks like it's optimized really well somehow.

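The two equivalent ways of building the stack root-first can be compared side by side. This is a sketch over a toy parent chain (relative performance will vary with engine version and array size; the jsperf link above is the thread's actual measurement):

```typescript
// Walking a parent chain leaf-to-root, then presenting the stack root-first.
// Variant A prepends each parent with unshift; variant B appends with push and
// reverses once at the end. Both produce the same ordering.
interface HeapNode { id: number; parent?: number }

const nodeById = new Map<number, HeapNode>([
  [1, {id: 1}],
  [2, {id: 2, parent: 1}],
  [3, {id: 3, parent: 2}],
])

function stackWithUnshift(leaf: HeapNode): HeapNode[] {
  const stack: HeapNode[] = [leaf]
  let current = leaf
  while (true) {
    if (typeof current.parent !== 'number') break
    const parent = nodeById.get(current.parent)
    if (parent === undefined) break
    stack.unshift(parent) // prepend: O(n) per call in a naive vector
    current = parent
  }
  return stack
}

function stackWithPushReverse(leaf: HeapNode): HeapNode[] {
  const stack: HeapNode[] = [leaf]
  let current = leaf
  while (true) {
    if (typeof current.parent !== 'number') break
    const parent = nodeById.get(current.parent)
    if (parent === undefined) break
    stack.push(parent)
    current = parent
  }
  return stack.reverse() // single O(n) pass at the end
}

const leaf = nodeById.get(3)!
console.log(stackWithUnshift(leaf).map(n => n.id))     // [ 1, 2, 3 ]
console.log(stackWithPushReverse(leaf).map(n => n.id)) // [ 1, 2, 3 ]
```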

Outdated diff: src/import/v8heapalloc.ts
@vmarchaud


Contributor

vmarchaud commented Oct 5, 2018

First round of fixes; I will add the documentation and tests later in the day.

@vmarchaud


vmarchaud Oct 5, 2018

Contributor

Importing a Heap Profile from the Chrome inspector

As with CPU profiling, you can use the Chrome Inspector to record a heap allocation profile: launch your process with the --inspect flag, open the inspector, and go to the Memory section:

Click the start button to start recording a profile.

Now use your application as you normally would, then click the Stop button.

Saving the ".heapprofile" file

Click on "Save" to save the recorded profile. Keep the default .heapprofile file extension.

You should be able to open that file in https://www.speedscope.app/.

To view it offline, you'll need to install speedscope via npm:

npm install -g speedscope

Then you should be able to open it like so:

speedscope /path/to/profile.heapprofile

vmarchaud added some commits Oct 5, 2018

@vmarchaud


vmarchaud Oct 5, 2018

Contributor

I'm not sure I understand why the tests are failing; the CI only logs:

This is likely because this test is run in a continuous integration (CI) environment in which snapshots are not written by default.

Do you have any idea why?


@jlfwong

Sweet! I think all that's left is to run npm run jest -- -u on this branch and push the result. Once that's done, CI should pass, and I can merge and deploy the result. Thanks for following through with PR feedback so quickly!

import {checkProfileSnapshot} from '../lib/test-utils'
test('importV8HeapAlloc from Chrome', async () => {
await checkProfileSnapshot('./sample/profiles/Chrome/69/Heap-20181005T144546.heapprofile')


@jlfwong

jlfwong Oct 5, 2018

Owner

The way these tests work is by generating output, then comparing against output that's checked into the repository. To seed the initial data, you'll need to run npm run jest -- -u, which should create local snapshot files that you'll need to check into the repository. I believe that should result in CI passing.

CI is currently failing because jest is run in a mode which does not write snapshots to disk, it only compares them. It doesn't find a corresponding snapshot on disk to compare against since none was checked in, resulting in a test failure.

See: https://jestjs.io/docs/en/snapshot-testing

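The snapshot workflow described above can be modeled in a few lines. This is a toy sketch, with an in-memory map standing in for the checked-in snapshot files; jest's real implementation differs:

```typescript
// Toy model of jest snapshot matching: in CI mode a missing snapshot fails
// instead of being written; running with -u (update mode) writes or updates it.
const snapshots = new Map<string, string>() // stands in for files on disk

function checkSnapshot(name: string, value: string, updateMode: boolean): string {
  const stored = snapshots.get(name)
  if (stored === undefined) {
    if (!updateMode) return 'fail: no snapshot on disk' // CI behavior
    snapshots.set(name, value) // `npm run jest -- -u` seeds the snapshot
    return 'written'
  }
  return stored === value ? 'pass' : 'fail: snapshot mismatch'
}

console.log(checkSnapshot('heap import', 'profile-contents', false)) // fails in CI
console.log(checkSnapshot('heap import', 'profile-contents', true))  // seeds it
console.log(checkSnapshot('heap import', 'profile-contents', false)) // now passes
```

This mirrors the failure in the thread: the first CI run had no checked-in snapshot to compare against, and seeding it locally with -u (then committing the generated snapshot file) makes subsequent compare-only runs pass.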

import {getOrInsert} from '../lib/utils'
import {ByteFormatter} from '../lib/value-formatters'
/**


@jlfwong

jlfwong Oct 5, 2018

Owner

This is perfect! Thanks for writing this :)


@vmarchaud


vmarchaud Oct 8, 2018

Contributor

And done! Thanks for the help with generating the snapshot.

Just to let you know, I'm also working on supporting an old V8 CPU profile format (mainly output by https://github.com/hyj1991/v8-profiler-node8), which is the format we currently use ourselves, so a PR should arrive this week I believe ;)


@jlfwong jlfwong merged commit 4b292b2 into jlfwong:master Oct 8, 2018

2 checks passed

continuous-integration/travis-ci/pr The Travis CI build passed
Details
coverage/coveralls Coverage increased (+0.6%) to 44.482%
Details
@jlfwong


jlfwong Oct 8, 2018

Owner

@vmarchaud thanks for the hard work! This is now merged, published to npm, and deployed to https://www.speedscope.app/. I also updated the documentation here: https://github.com/jlfwong/speedscope/wiki/Importing-from-Node.js#importing-a-heap-profile

Just to tell you also, i'm working to support an old cpu profile format of v8 (mainly outputted by https://github.com/hyj1991/v8-profiler-node8), which is the current format we use ourselves, so an PR will arrive this week i believe ;)

Sounds good to me 😄

