
Improve performance of accessing globals in scripts running in vm modules #31658

Open
SimenB opened this issue Feb 6, 2020 · 14 comments
Labels: performance, v8 engine, vm

Comments

@SimenB (Member) commented Feb 6, 2020

Is your feature request related to a problem? Please describe.

Accessing globals from within a vm.Script is guarded by interceptors, which makes every access slow (from my understanding; @fhinkel has an excellent comment explaining some of this here: jestjs/jest#5163 (comment)). The suggested workaround is to evaluate and cache the global you want to access (Math in my example below) and inject it into the vm.Script. This is an acceptable workaround for vm.Script and vm.compileFunction, but it cannot be done when using the ESM vm.SourceTextModule, as there is no wrapping function call.
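
For reference, a minimal sketch of what that wrapping-function workaround looks like with plain vm.Script (the file and variable names here are illustrative, not from the original report):

// workaround-sketch.js
const vm = require("vm");

const context = vm.createContext({ console });

// Look the global up once in the context and keep a reference to it.
const cachedMath = vm.runInContext("Math", context);

// Wrap the user code in a function so the cached reference can be passed in
// as a parameter instead of being resolved through the context's globals.
const script = new vm.Script("(function (Math) { return Math.abs(-42); })");
const wrapper = script.runInContext(context);

console.log(wrapper(cachedMath)); // 42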

I've put together this script you can run to see the numbers:

// test.js
const {
  compileFunction,
  createContext,
  runInContext,
  SourceTextModule
} = require("vm");
const context = createContext({ console });

const sourceCode = `
  const runTimes = 10000000;
  let sum = 0;
  console.time();
  for (let i = 0; i < runTimes; i++) {
    sum += Math.abs(0);
  }
  console.timeEnd();
`;

(async () => {
  const compiledFunction = compileFunction(sourceCode, [], {
    parsingContext: context
  });

  const vmModule = new SourceTextModule(sourceCode, { context });

  await vmModule.link(() => {
    throw new Error("should never be called");
  });

  // These are both super slow (1.6 seconds on my machine)
  compiledFunction();
  await vmModule.evaluate();

  // however, this is quick (less than 10 milliseconds on my machine)
  const compiledWithMathFunction = compileFunction(sourceCode, ["Math"], {
    parsingContext: context
  });
  compiledWithMathFunction(runInContext("Math", context));
})().catch(error => {
  console.error(error);
  process.exitCode = 1;
});

Running this gives the following results on my machine:

$ node --experimental-vm-modules test.js
(node:19958) ExperimentalWarning: VM Modules is an experimental feature. This feature could change at any time
default: 1.698s
default: 1.620s
default: 7.306ms

So ~1600ms without the workaround, compared to ~7ms with it.

Describe the solution you'd like

I'd like accessing globals inside vm to be as fast as (or as close as possible to) accessing globals outside of vm. Would it be possible for Node's vm implementation to cache the global property lookups, so that the price is paid only once instead of on every single access?

Describe alternatives you've considered

As mentioned, caching and injecting the global manually works when there is a function wrapper, but this doesn't work with ESM. I've tried assigning the Math global to the context, and while that halves the time spent (~800ms), it's still 2 orders of magnitude slower than injecting a reference.
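
For clarity, this is roughly what the "assign the global to the context" alternative looks like (a sketch only; the loop mirrors the benchmark above, and the file name is illustrative):

// assign-global-sketch.js
const vm = require("vm");

const context = vm.createContext({ console });

// Copy the host Math onto the sandbox so it becomes an own property of the
// context's global object. Lookups still go through the global proxy, which
// presumably is why this only roughly halves the time in the numbers above.
context.Math = Math;

vm.runInContext(`
  console.time("assigned");
  let sum = 0;
  for (let i = 0; i < 10000000; i++) {
    sum += Math.abs(0);
  }
  console.timeEnd("assigned");
`, context);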

@addaleax added the perf_hooks and vm labels on Feb 6, 2020
@devsnek (Member) commented Feb 6, 2020

There is nothing Node can do to help performance here. The problem is that V8's global proxy design forces the embedder to indirect through C++ for every lookup, which adds a lot of overhead.

@devsnek added the v8 engine label and removed the perf_hooks label on Feb 6, 2020
@SimenB (Member, Author) commented Feb 6, 2020

And there's no way to cache/circumvent that lookup in Node? If not, do you think there's any chance of V8 making changes so we can work around it?

Or are there any workarounds, similar to the current one, that we can use for SourceTextModule (or whatever the ESM vm API ends up being)?

@devsnek (Member) commented Feb 6, 2020

@SimenB You could do const context = createContext(); context.console = console instead of const context = createContext({ console }).

In the long term V8 might need to refactor this system anyway because of the upcoming realms API, so maybe one day createContext({}) will be fast too.
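
A short sketch of the two construction patterns being compared (illustrative only; as noted below, they turned out to benchmark the same):

// patterns-sketch.js
const vm = require("vm");

// Pattern 1: pass the sandbox object to createContext().
const contextA = vm.createContext({ console });

// Pattern 2: create an empty context, then assign globals afterwards.
const contextB = vm.createContext();
contextB.console = console;

vm.runInContext("console.log('A')", contextA);
vm.runInContext("console.log('B')", contextB);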

@SimenB (Member, Author) commented Feb 6, 2020

@devsnek I mentioned that in the OP - it takes the time from 1600ms to 800ms (so halves the time), but it's still 2 orders of magnitude slower than injecting it (which is less than 10ms).

I see no difference between createContext({ someGlobal }) and createContext().someGlobal = someGlobal - both are twice as fast as not assigning the global at all, but still super slow compared to running outside of vm altogether.

@devsnek (Member) commented Feb 6, 2020

@SimenB it seems that at some point we started creating the internal proxy unconditionally, which is why you're not seeing a speedup from what I suggested.

@devsnek (Member) commented Feb 6, 2020

Using the vm.Context constructor from #30709 also fixes this. Hopefully I can get that merged soon.

@SimenB (Member, Author) commented Feb 6, 2020

Ah, wonderful!

@addaleax added the performance label on Feb 7, 2020
@SimenB (Member, Author) commented Feb 7, 2020

I just compiled your branch and can confirm it does indeed fix the issue 😀

🤞 you're able to land it! It might be a bit premature, but do you think it'll be backported to v10 and v12? Since it's a different API, graceful degradation should be fine, but the free performance boost for all release lines would be wonderful (and Jest could remove the option we added to let people work around this).

@devsnek mentioned this issue on Feb 14, 2020
devsnek added a commit to devsnek/node that referenced this issue on Feb 17, 2020
@juanarbol (Member):
Fixed in 4725ac6 :)

@juanarbol (Member):
Sorry @SimenB, is this issue solved?

@juanarbol reopened this on May 2, 2020
@SimenB (Member, Author) commented May 2, 2020

I don't think so, #30709 was closed unmerged?

@SimenB (Member, Author) commented May 2, 2020

Please land it, it would make me very happy 😀

@TimothyGu (Member):
Also note that this issue has been known since as far back as 2016: nodejs/benchmarking#75 (comment).

@thernstig (Contributor):
Considering it was 1.5 years ago that #30709 was closed in favor of the realms API, has the realms API landed, or should @devsnek's PR still be worked on?
