Meeting with V8, 2018-01-31

Links

Present

TSC

V8

Other

Summary

Performance of async hooks (0:10)

  • Facing up to 30% performance regression (on hapi17) when enabling async hooks.
  • Koa is not as bad.
  • Core design goal was no performance impact when hooks are not enabled; some impact when enabled is expected.
  • Differing opinions on whether this is a blocker.
  • To alleviate the performance regression we likely need a redesign, e.g. having a way to monkey-patch Promise construction in async calls. That would also work in the browser, but would need more time and spec work.
  • Currently observed performance regression may be more than users would expect, compared to existing mechanisms. 30% sounds like too much for always-on in production.
  • Domains for context tracking are based on async hooks.
  • Maybe we should not move this out of experimental just yet if we expect breaking changes in the future to fix performance. Maybe port async hooks to JavaScript.
  • Making async calls interceptable may not be a good idea from a standards perspective.
  • This may not only affect async/await and Promises, but also timers, etc.
  • Native modules also tie into async hooks.
  • Promise hooks store data in a TypedArray to make it quickly accessible from both JS and C++.
  • Destroy hook is one of the big issues: it makes the init hook expensive and is often not called in a timely fashion. The main use cases are verification, freeing memory, and catching unhandled Promises, but the latter is not reliable because it relies on GC. (A minimal usage sketch of the init/destroy hooks follows this list.)
  • If destroy hook is really that important maybe V8 should track it natively.
  • Async hooks performance will become worse as developers adopt async/await. We need to find a solution.
  • Only part of the problem is the boundary crossing between JS and C++; the other part is the weak handle on the Promise and the destroy hook.
  • We landed a CL in V8 to reduce Promise size, resulting in fewer GCs and better performance. Large allocations in addition to holding onto Promises work against the generational hypothesis for the GC. Having objects survive into old space makes GC more expensive.
  • Are async hooks other than the Promise hooks bottlenecks? No specific numbers, but experience suggests ~10%, mostly from calling nextTick. 10% might be acceptable for production.
  • 10% should be the target for Promise and async/await hooks too, to become competitive.
  • Existing APM solutions do not work with async/await.
  • Matteo will create a write-up.
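
For reference, a minimal sketch of the async_hooks API under discussion (the experimental Node.js API with init/destroy hooks). It only illustrates which callbacks the Promise integration has to drive and is not a statement about where the overhead ends up:

```js
// Minimal sketch of the experimental async_hooks API discussed above.
// With no hooks enabled there should be no cost; the overhead appears
// once hooks like init/destroy have to fire for every Promise.
const async_hooks = require('async_hooks');
const fs = require('fs');

const hook = async_hooks.createHook({
  init(asyncId, type, triggerAsyncId) {
    // Called when an async resource (Promise, Timeout, TCPWRAP, ...) is created.
    // console.log would itself create async resources, so write synchronously.
    fs.writeSync(1, `init ${type} id=${asyncId} trigger=${triggerAsyncId}\n`);
  },
  destroy(asyncId) {
    // For Promises this only fires after GC collects the object, which is
    // why it is both expensive and not timely, as noted above.
    fs.writeSync(1, `destroy id=${asyncId}\n`);
  }
});

hook.enable();
Promise.resolve(42).then(() => hook.disable());
```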

Update on GYP/GN (19:50)

  • We will remove gyp files from V8 in 6.6.
  • Node will need to inherit them.
  • Currently moving gyp files into a central folder.
  • On V8’s fork of Node, we can ./configure --build-v8-with-gn to fetch dependencies and build with GN by invoking make. This works with Mac and Linux.
  • Not designed for default build. More a crutch for the V8 team to test V8 against Node.js on CI.
  • Node canary branch is maintained automatically. The script that does that needs to be changed.
  • Gyp configurations will need to be maintained by Node. Usually changes only affect file lists.
  • File changes can be checked via v8/gypfiles/verify_source_deps.py or via gn desc.
  • gn desc shows the source list, includes, compile flags etc.
  • Changes to build steps are expected maybe once a year. Those would require translating GN rules to gyp rules.
  • The next change to build steps, in the next 2-3 months, will probably affect mksnapshot to implement putting V8-generated builtins into the binary. The change will probably be minor though.
  • For day-to-day testing with canary we could either automate updating file lists or use --build-v8-with-gn. In case of the latter we would lose nightly builds for some platforms.
  • The GN bridge does not yet work for Windows, but V8 wants to work on this. It is also not going to work for AIX or FreeBSD.

General perception of Promise performance (27:10)

  • What are general issues with Promises?
  • Allocation causing bad GC performance as mentioned previously. Work being done here.
  • The default Promise constructor runs its executor synchronously, but fs promises need to propagate exceptions asynchronously. So the Promise is actually created inside a Promise resolver, making this two Promises (see the sketch below). Maybe there is a way to make this faster?
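
A hedged illustration of the "two Promises" pattern described above; this is not the actual fs promises implementation, just a sketch of why a synchronously running executor forces an extra deferring Promise:

```js
// Sketch only: the executor passed to `new Promise` runs synchronously, so a
// synchronous throw (e.g. argument validation) would escape the call instead
// of rejecting. Wrapping the work in an already-resolved Promise defers it,
// at the cost of a second Promise allocation per call.
const fs = require('fs');

function readFilePromise(path) {
  return Promise.resolve().then(() => {          // Promise #1: defers execution
    if (typeof path !== 'string') {
      throw new TypeError('path must be a string'); // becomes an async rejection
    }
    return new Promise((resolve, reject) => {    // Promise #2: the actual I/O
      fs.readFile(path, (err, data) => err ? reject(err) : resolve(data));
    });
  });
}

readFilePromise(42).catch((err) => console.error(err.message));
```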

Status of workers (29:00)

  • Anna Henningsen would know best. Sync with her directly makes most sense.
  • SharedArrayBuffers are going to be disabled for the short term. The current goal is to re-enable them "at some point".

V8 extras in Node.js (31:00)

  • Originally designed to expose V8 internal features to JavaScript so that Chrome can use them to implement web platform features, such as streams.
  • Test coverage is not as good as it could be.
  • Stagnated since the initial implementation.
  • Please do not use. It’s not part of the standard JavaScript language.
  • Open PR to expose V8 extras to Node.js to be able to use private symbols.
  • Private symbols could still leak if not used carefully.
  • We are trying to remove features from V8 extras. Having an additional user would work counter to this.
  • V8 extras should be considered a hidden API.
  • WeakMap can be an alternative to private symbols; private class fields will be another in the future (see the sketch after this list).
  • V8 extras do not have debugging features. They contain private symbols, fast paths for the Promise constructor, and internal arrays. Dangerous to use without context.
  • Natives syntax will be phased out, since V8 is moving builtins away from being implemented in JavaScript. It will still be available for testing purposes.
  • Conclusion: do not use.
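
A short sketch of the WeakMap alternative mentioned above (class and property names are made up for illustration):

```js
// Internal per-object state keyed by the public object: unreachable from user
// code, and released automatically when the object is garbage collected.
const internalState = new WeakMap();

class Resource {
  constructor(name) {
    internalState.set(this, { name, opened: Date.now() });
  }
  describe() {
    const state = internalState.get(this);
    return `${state.name} (opened ${state.opened})`;
  }
}

const r = new Resource('db-connection');
console.log(r.describe());   // internal data stays accessible to the class
console.log(Object.keys(r)); // [] -- nothing leaks onto the instance
```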

Memory tooling (37:00)

  • Trying to improve memory tooling to debug memory leaks.
  • Heap snapshot is not very usable.
  • Maybe add the source position where an object was created to retaining paths. But not useful for production.
  • Converting core dumps to heap snapshots would be another idea. Netflix is working on a prototype based on llnode. The DevTools UI is a lot nicer to use than a raw heap snapshot (a capture sketch follows this list).
  • Allocation profiling provides allocation source positions, but is not yet accurate with respect to inlined functions. Allocation sites may also not be helpful.
  • Maybe an alternative format to JSON for heap snapshots, but some other work precedes that.
  • Changing maximum string size for heap snapshot would be useful and should be possible.
  • Memory tooling for the case where we are not observing a leak, but allocation patterns that cause bad GC performance? E.g. when short-lived objects survive into old space. This could be solved by increasing the new space size.
  • Async patterns could cause issues for the GC by doing too much work for marking.
  • Repro case would be great to work with the GC team. Maybe there are tools that the GC team already uses. Matteo to provide some data.
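
For context, a hedged sketch of capturing a heap snapshot programmatically via the inspector module, as a starting point for the kind of leak investigation discussed above (the resulting file can be loaded into the DevTools Memory tab):

```js
// Sketch only: stream a heap snapshot to disk using the built-in inspector
// protocol, then inspect the .heapsnapshot file in DevTools.
const inspector = require('inspector');
const fs = require('fs');

const session = new inspector.Session();
session.connect();

const out = fs.createWriteStream('node.heapsnapshot');
session.on('HeapProfiler.addHeapSnapshotChunk', (m) => {
  out.write(m.params.chunk);
});

session.post('HeapProfiler.takeHeapSnapshot', null, (err) => {
  out.end();
  session.disconnect();
  console.log(err || 'wrote node.heapsnapshot');
});
```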

GCC/MSVC support in V8 (47:00)

  • Presentation on V8 and Clang by Michael Lippautz.
  • V8 currently supports GCC, Clang, and MSVC.
  • Chrome is moving towards only supporting Clang.
  • Test coverage for GCC and MSVC is growing thin.
  • Clang supports plugins, which can be very useful to catch bugs.
  • V8 uses handles to track garbage collected stack slots.
  • If we create a stack map using a Clang plugin, we would not need to use handles.
  • That would result in better performance (estimated up to 10%), especially for crossing the JS/C++ boundary.
  • That requires V8 to be built with Clang.
  • Native addon modules would not be affected since V8’s API is not affected.
  • Clang is ABI-compatible with GCC, so linking Clang-compiled binaries to GCC-compiled ones works (Node core and native addons).
  • Clang might be problematic for AIX. Muntasir to take a look.
  • MSVC does not support .incbin assembly directive, which we want to use for moving builtins to the binary to reduce V8’s heap size.
  • Clang used to produce worse code than MSVC, but that’s resolved now.
  • Clang also seems to produce better code than GCC.
  • When would the Clang requirement happen? No roadmap yet; currently a very early phase. Experiments to implement a Clang plugin for verifying handles would be the first step.

Snapshot support in Node.js (59:10)

  • Node issue to implement a custom startup snapshot for Node.js. Estimated startup speedup is 4x.
  • Alibaba picked this up. Experiment confirms 4x.
  • Node.js startup runs spaghetti code that heavily relies on command line flags and env vars.
  • Cleanup is necessary to make initialization more structured.
  • Goal: put as much as possible into a phase that is deterministic and does not depend on command line flags and env vars.
  • Allocating C++ objects might be tricky, since we would need to allocate them and hook them up correctly when deserializing from snapshot too.
  • Process object setup needs to be isolated and refactored.
  • Native bindings need to be listed and passed to V8.
  • Worth doing due to the huge performance win. Alibaba might be interested in this because they may be planning to use Node.js as an application runtime on mobile.
  • Once this works, we can extend this to user code. Several projects including webpack are very interested in this. This needs a container format to support native modules.

Spectre and Meltdown (1:06:10)

  • Node.js is not directly affected, but the OS needs to be patched.
  • V8 announcement. Runtime flag to disable mitigations (--no-untrusted-code-mitigations) in favor of performance.
  • --no-untrusted-code-mitigations to avoid performance impact from mitigations that V8 is implementing for Chrome.
  • Test coverage for both with and without mitigations.
  • Chrome is not going to provide test coverage in the wild for --no-untrusted-code-mitigations.
  • Performance penalty probably upwards of 10%.

Trace events (1:09:00)

  • Make trace events available for JS with low overhead?
  • Definitely interest, but not super critical.
  • Node.js has several instrumentation APIs. Needs consolidation.
  • V8 could provide a JS API to trigger trace events that can be inlined into optimized code.
  • Need some more thought here both in V8 and Node.js.

REPL mode for V8 parser (1:12:30)

  • DevTools console and Node.js REPL execute code a bit differently than regular JavaScript.
  • Expectations are not entirely clear.
  • Implementations are hacky, e.g. to support top-level await.
  • Maybe should be supported in V8’s parser.
  • Specifying this would be great. Otherwise V8 would be implementing unspec’ed behavior.
  • Topic for next TC39?