
Results can be a bit misleading, as they show the sum of the executions #1

Open
leipert opened this issue Jul 17, 2019 · 4 comments

leipert commented Jul 17, 2019

@mathiasbynens Thank you for the great work, I learned a lot:

  1. The inner workings of JSON vs. JS literals in JS engines, and yes, JSON parsing is faster because of its limited grammar compared to object literals.
  2. jsvu, which I didn't know about.

But I have a question regarding the methodology. Currently, the timings are the aggregated results of a hundred runs, e.g.

  printf "Benchmarking JS literal on ${bin}";
  time (for i in {1..100}; do $bin out/js.js; done);

So that means that the stats from the README:

|         | JS literal | JSON.parse | Speed-up |
| ------- | ---------- | ---------- | -------- |
| V8 v7.5 | 23.765 s   | 15.766 s   | 1.5×     |

could also read:

|         | JS literal | JSON.parse | Speed-up |
| ------- | ---------- | ---------- | -------- |
| V8 v7.5 | 237.65 ms  | 157.66 ms  | 1.5×     |

So it takes V8 v7.5 either 157 or 237 _milli_seconds to parse an 8-megabyte file. This is an impressive feature, but one saves "only" 80 ms per run, not the 8 seconds the README's table suggests at first glance.


leipert commented Jul 17, 2019

Also https://v8.dev/blog/cost-of-javascript-2019#json suggests:

> A good rule of thumb is to apply this technique for objects of 10 kB or larger — but as always with performance advice, measure the actual impact before making any changes.

I tried changing the tests to an 80 kB file, where I got results around 23.5 ms for both, which seems to be the startup time of V8 (an empty realm file yields the same result).

How would one go about doing a proper benchmark with such a small file?
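One ingredient, whatever the harness ends up looking like, is a reproducible small payload. A sketch of generating one (the file name, key names, and entry count are my own arbitrary choices, not from this repo) that lands in the ~90 kB range:

```shell
# Sketch: generate a JSON file of roughly 90 kB so the small-payload
# case can be fed through the existing benchmark harness.
# 2000 entries of ~45-50 bytes each ≈ 90-100 kB.
{
  printf '{'
  for i in $(seq 1 2000); do
    printf '"key%s":"some reasonably long string value %s",' "$i" "$i"
  done
  printf '"end":0}'   # closing entry avoids a trailing comma
} > small.json
wc -c small.json
```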


leipert commented Jul 17, 2019

I have now executed 2000 runs on an 80 kB file, see #2

As a baseline, I created a file ./out/empty.js with no content. I have only run the benchmark against v8-7.5.288, and it shows the following results:

  Benchmarking empty on v8-7.5.288… 48.380
  Benchmarking JS literal on v8-7.5.288… 54.265
  Benchmarking JSON.parse on v8-7.5.288… 51.835

So starting up, parsing, and executing the empty file takes 21.69 milliseconds per run; let's use that as a baseline. The JS literal takes 27.133 milliseconds and JSON.parse takes 25.918 milliseconds. The pure parsing time should then roughly be:

|         | JS literal | JSON.parse | Speed-up |
| ------- | ---------- | ---------- | -------- |
| V8 v7.5 | 5.443 ms   | 4.228 ms   | 1.28×    |

So JSON.parse is still faster than the JS literal, at closer to 1.3×. Parsing 80 kB (roughly one jQuery) is around 1.2 milliseconds faster. Looking at that, I would probably not recommend using this technique to save computing time in web bundles, for example; it might be different in Node.js, where one's server bill could be smaller if less compute is used.


mathiasbynens commented Jul 18, 2019

> So it takes V8 v7.5 either 157 or 237 _milli_seconds to parse an 8-megabyte file. This is an impressive feature, but one would save "only" 80 ms, not the 8 seconds the README suggests.

I don't think the README suggests an 8-second improvement in any way. It explicitly points out how the measurements are taken (i.e., 100 d8 invocations).

Btw, I noticed that you're only testing in V8 v7.5. If you pick only a single V8 version to test in, it should probably be the latest one (currently v7.7). Note that V8 v7.6 had significant JSON.parse improvements.


leipert commented Jul 24, 2019

I know it doesn't say that, but looking at the README, the first thing that jumps out is the table. If I were to present data like this in another context, it would look odd as well:


Travel times from Leipzig → Berlin (200 km), benchmarked 100 times:

|            | Bob           | Anna         | Speed-up |
| ---------- | ------------- | ------------ | -------- |
| Opel Corsa | 10000 minutes | 8000 minutes | 1.25×    |
| Tesla      | 8000 minutes  | 6000 minutes | 1.33×    |

vs.

|            | Bob         | Anna       | Speed-up |
| ---------- | ----------- | ---------- | -------- |
| Opel Corsa | 100 minutes | 80 minutes | 1.25×    |
| Tesla      | 80 minutes  | 60 minutes | 1.33×    |

I guess what I am saying is: I am just used to benchmarks and statistics being normalized to the individual. You'd also never say: this group of 100 people is 6720 years old vs. this group, which is 6540 years old 😉
