
Benchmarking Working Group

Mandate

The Benchmarking Working Group's purpose is to gain consensus on an agreed set of benchmarks that can be used to:

  1. track and evangelize performance gains made between Node releases
  2. avoid performance regressions between releases

Its responsibilities are:

  1. Identify one or more benchmarks that reflect customer usage. More than one will likely be needed to cover typical Node.js use cases, including low latency and high concurrency
  2. Work to get community consensus on the chosen list
  3. Add regular execution of the chosen benchmarks to Node.js builds
  4. Track and publicize performance across builds/releases (see the sketch after this list)
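
For illustration only, the following is a minimal sketch (not part of this repository) of how per-benchmark results from two releases could be compared to flag regressions. The benchmark names, ops/sec figures, and the 5% threshold are illustrative assumptions, not the group's agreed methodology.

```js
// Minimal sketch: flag regressions between two Node.js releases given
// ops/sec results for the same benchmarks. All values below are made up.

const baseline  = { 'http/simple': 12000, 'buffers/creation': 85000 }; // older release
const candidate = { 'http/simple': 11000, 'buffers/creation': 86000 }; // newer release

const THRESHOLD = 0.05; // treat a >5% slowdown as a regression (assumed policy)

for (const [name, oldOps] of Object.entries(baseline)) {
  const newOps = candidate[name];
  if (newOps === undefined) continue; // benchmark missing from the candidate run
  const change = (newOps - oldOps) / oldOps;
  const label = change < -THRESHOLD ? 'REGRESSION' : 'ok';
  console.log(`${name}: ${(change * 100).toFixed(1)}% (${label})`);
}
```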

The path forward, along with the infrastructure in place so far, is described in the benchmarks README: https://github.com/nodejs/benchmarking/blob/master/benchmarks/README.md

Current Project Team Members

  • Michael Dawson (@mhdawson), Facilitator
  • Trevor Norris (@trevnorris)
  • Ali Sheikh (@ofrobots)
  • Yosuke Furukawa (@yosuke-furukawa)
  • Yunong Xiao (@yunong)
  • Mark Leitch (@m-leitch)
  • Surya V Duggirala (@suryadu)
  • Uttam Pawar (@uttampawar)
  • Michael Paulson (@michaelbpaulson)
  • Gareth Ellis (@gareth-ellis)
  • Wayne Andrews (@CurryKitten)
  • Kyle Farnung (@kfarnung)
  • Kunal Pathak (@kunalspathak)
  • Benedikt Meurer (@bmeurer)
