
[Proposal] Async stack traces. #340

Conversation


@adriancooney adriancooney commented Apr 5, 2018

This PR introduces async stack traces to Backburner so it's easier to debug the runloop by allowing the developer to step back through the asynchronous stack frame.

Before / After stack trace for this demo code:

[gif: before/after Backburner stack traces, stepping through the demo call stack]

After some cursory research, it doesn't look like Chrome exposes an API for custom async stack traces. One was actually requested by what looks like an Ember developer, but that was back in 2014 and there hasn't been much activity on it since. That left only the hacky approach of emulating what Chrome considers an async action via Promises. I've created a new method on the Backburner class called _asyncExecute that takes a method and synchronously returns a new Promise's resolve function (not the promise itself!). When Backburner later calls that returned resolve function (as if it were the method itself), the promise chain advances and invokes the original method passed to _asyncExecute. This tricks the Dev Tools into thinking an asynchronous action happened when in fact it's just a simple callback.
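The idea can be sketched in a few lines (a standalone sketch of the description above, not the actual `_asyncExecute` implementation, whose exact signature may differ):

```javascript
// Sketch of the _asyncExecute trick: return a Promise's resolve function
// synchronously, so that calling it later advances the promise chain and
// runs the original method as a microtask. Dev Tools records the hop
// through the microtask queue as an "async" stack frame.
function asyncExecute(method) {
  let resolve;
  const promise = new Promise((_resolve) => { resolve = _resolve; });
  // When the promise settles, run the original method.
  promise.then(() => method());
  // Hand back the resolver; the scheduler invokes it in place of the method.
  return resolve;
}
```

Calling the returned resolver doesn't run the method synchronously; it enqueues it on the microtask queue, which is exactly what makes Dev Tools stitch the stack.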

Thankfully we only need to patch two methods, schedule and later. later doesn't strictly need to be patched, since Chrome can already represent its async operation accurately (it's a setTimeout), but it also gets scheduled into the default queue for execution once the timeout completes, which makes for two async hops. By patching later, we reduce this to one async hop for clarity. All of this, of course, only happens when Backburner#DEBUG is true. Finally, there are no breaking API changes.

The method is a little hacky, and the Promise.then (async) entry between async stack frames in the sidebar Call Stack (see gif) is a little misleading, but I'd consider this a minor setback for an increase in debuggability and productivity. One of my biggest gripes with Ember is how frustrating it is to debug the runloop.

I expected a performance hit to Backburner in debug mode because of the promises on top of the current stack-stitching system but, to my surprise, there's barely any. I suspect it's the new Error call that causes most of the performance gap between non-debug and debug modes. When Firefox and Safari (and Edge?) support async stack traces, maybe we could remove that call altogether and get DEBUG performance to a reasonable level?
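For context, a rough illustration of the kind of eager stack capture that DEBUG-mode stack stitching pays for on every scheduled job (this is not Backburner's actual code, just the pattern):

```javascript
// Constructing an Error forces the engine to materialize a stack trace,
// which is far more expensive than merely queuing a callback. Doing this
// per scheduled job is the suspected cost behind the DEBUG-mode numbers.
function captureStack() {
  return new Error().stack;
}
```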

  Benchmark                                                                            |     master      |      PR
  ------------------------------------------------------------------------------------------------------------------------
  DEBUG - Schedule & Flush - function ................................................ | 126,951.07 op/s | 112,396.69 op/s
  DEBUG - Schedule & Flush - target, function ........................................ | 126,113.43 op/s | 118,414.02 op/s
  DEBUG - Schedule & Flush - target, string method name .............................. | 130,435.87 op/s | 125,710.38 op/s
  DEBUG - Schedule & Flush - target, string method name, 1 argument .................. | 133,912.92 op/s | 128,377.54 op/s
  DEBUG - Schedule & Flush - target, string method name, 2 arguments ................. | 132,708.16 op/s | 129,048.85 op/s
`node bench` on master

[backburner.js] (master) $ node bench
testing
- Debounce - function
- Debounce & Cancel - function, target
- Later & Cancel - function
- Later & Cancel - function, target
- Schedule & Cancel - function
- Schedule & Cancel - target, function
- Schedule & Cancel - target, string method name
- Schedule & Cancel - target, string method name, 1 argument
- Schedule & Cancel - target, string method name, 2 arguments
- Schedule & Cancel - prescheduled, same queue - target, string method name
- Schedule & Cancel - prescheduled, separate queue - target, string method name
- Schedule & Flush - function
- Schedule & Flush - target, function
- Schedule & Flush - target, string method name
- Schedule & Flush - target, string method name, 1 argument
- Schedule & Flush - target, string method name, 2 arguments
- DEBUG - Schedule & Flush - function
- DEBUG - Schedule & Flush - target, function
- DEBUG - Schedule & Flush - target, string method name
- DEBUG - Schedule & Flush - target, string method name, 1 argument
- DEBUG - Schedule & Flush - target, string method name, 2 arguments
- Throttle - function
- Throttle & Cancel - function, target
running first test, please wait...
  Debounce - function ................................................................ 324,055.10 op/s
  Debounce & Cancel - function, target ............................................... 291,506.86 op/s
  Later & Cancel - function .......................................................... 333,466.75 op/s
  Later & Cancel - function, target .................................................. 330,913.76 op/s
  Schedule & Cancel - function ..................................................... 4,526,880.55 op/s
  Schedule & Cancel - target, function ............................................. 2,215,570.12 op/s
  Schedule & Cancel - target, string method name ................................... 2,166,164.74 op/s
  Schedule & Cancel - target, string method name, 1 argument ....................... 2,008,147.17 op/s
  Schedule & Cancel - target, string method name, 2 arguments ...................... 1,941,816.72 op/s
  Schedule & Cancel - prescheduled, same queue - target, string method name ........ 1,187,647.98 op/s
  Schedule & Cancel - prescheduled, separate queue - target, string method name .... 2,103,827.47 op/s
  Schedule & Flush - function ........................................................ 512,620.24 op/s
  Schedule & Flush - target, function ................................................ 497,121.32 op/s
  Schedule & Flush - target, string method name ...................................... 484,099.56 op/s
  Schedule & Flush - target, string method name, 1 argument .......................... 485,970.45 op/s
  Schedule & Flush - target, string method name, 2 arguments ......................... 480,963.27 op/s
  DEBUG - Schedule & Flush - function ................................................ 126,951.07 op/s
  DEBUG - Schedule & Flush - target, function ........................................ 126,113.43 op/s
  DEBUG - Schedule & Flush - target, string method name .............................. 130,435.87 op/s
  DEBUG - Schedule & Flush - target, string method name, 1 argument .................. 133,912.92 op/s
  DEBUG - Schedule & Flush - target, string method name, 2 arguments ................. 132,708.16 op/s
  Throttle - function .............................................................. 4,038,990.57 op/s
  Throttle & Cancel - function, target ............................................. 1,778,177.93 op/s
fastest: Schedule & Cancel - function

`node bench` on this PR

[backburner.js] (adrian.cooney/async-stack-traces) $ node bench
testing
- Debounce - function
- Debounce & Cancel - function, target
- Later & Cancel - function
- Later & Cancel - function, target
- Schedule & Cancel - function
- Schedule & Cancel - target, function
- Schedule & Cancel - target, string method name
- Schedule & Cancel - target, string method name, 1 argument
- Schedule & Cancel - target, string method name, 2 arguments
- Schedule & Cancel - prescheduled, same queue - target, string method name
- Schedule & Cancel - prescheduled, separate queue - target, string method name
- Schedule & Flush - function
- Schedule & Flush - target, function
- Schedule & Flush - target, string method name
- Schedule & Flush - target, string method name, 1 argument
- Schedule & Flush - target, string method name, 2 arguments
- DEBUG - Schedule & Flush - function
- DEBUG - Schedule & Flush - target, function
- DEBUG - Schedule & Flush - target, string method name
- DEBUG - Schedule & Flush - target, string method name, 1 argument
- DEBUG - Schedule & Flush - target, string method name, 2 arguments
- Throttle - function
- Throttle & Cancel - function, target
running first test, please wait...
  Debounce - function ................................................................ 337,010.46 op/s
  Debounce & Cancel - function, target ............................................... 293,808.65 op/s
  Later & Cancel - function .......................................................... 344,887.98 op/s
  Later & Cancel - function, target .................................................. 328,991.84 op/s
  Schedule & Cancel - function ..................................................... 4,501,900.81 op/s
  Schedule & Cancel - target, function ............................................. 2,206,946.83 op/s
  Schedule & Cancel - target, string method name ................................... 2,129,333.73 op/s
  Schedule & Cancel - target, string method name, 1 argument ....................... 1,800,934.46 op/s
  Schedule & Cancel - target, string method name, 2 arguments ...................... 1,745,172.19 op/s
  Schedule & Cancel - prescheduled, same queue - target, string method name ........ 1,130,978.88 op/s
  Schedule & Cancel - prescheduled, separate queue - target, string method name .... 1,890,185.90 op/s
  Schedule & Flush - function ........................................................ 480,524.04 op/s
  Schedule & Flush - target, function ................................................ 491,834.57 op/s
  Schedule & Flush - target, string method name ...................................... 496,861.43 op/s
  Schedule & Flush - target, string method name, 1 argument .......................... 482,918.45 op/s
  Schedule & Flush - target, string method name, 2 arguments ......................... 476,500.03 op/s
  DEBUG - Schedule & Flush - function ................................................ 112,396.69 op/s
  DEBUG - Schedule & Flush - target, function ........................................ 118,414.02 op/s
  DEBUG - Schedule & Flush - target, string method name .............................. 125,710.38 op/s
  DEBUG - Schedule & Flush - target, string method name, 1 argument .................. 128,377.54 op/s
  DEBUG - Schedule & Flush - target, string method name, 2 arguments ................. 129,048.85 op/s
  Throttle - function .............................................................. 3,968,894.84 op/s
  Throttle & Cancel - function, target ............................................. 1,743,115.59 op/s
fastest: Schedule & Cancel - function


I've put up a page on GitHub with a demo. Be sure to open Dev Tools and set a breakpoint in done.

http://adriancooney.ie/backburner.js/dist/demo/

Todo:

  • Write some extra tests. Suggestions welcome, because the effects only show up in the Dev Tools.

@adriancooney adriancooney changed the title Async stack traces. [Proposal] Async stack traces. Apr 5, 2018
@adriancooney adriancooney force-pushed the adrian.cooney/async-stack-traces branch from 47e3944 to 8517330 on April 5, 2018 11:24
Collaborator

@rwjblue rwjblue left a comment

Thanks for working on this problem! The one thing that strikes me here is that we may lose some of the ordering guarantees that we have today... 🤔

@krisselden / @stefanpenner / @ef4 - Thoughts?

lib/index.ts Outdated
.catch((err) => {
let onError = getOnError(this.options);

if (onError) {
Collaborator

The else case should throw err here (otherwise we swallow any/all errors when there is no Ember.onerror set).
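Applying the suggested fix, the catch handler might look like this (a sketch; `getOnError` and `options` stand in for the real helpers in lib/index.ts):

```javascript
// Hypothetical shape of the fixed catch handler: dispatch to onError when
// one is configured, otherwise rethrow so errors are never swallowed.
function makeCatchHandler(options, getOnError) {
  return (err) => {
    const onError = getOnError(options);
    if (onError) {
      onError(err);
    } else {
      throw err;
    }
  };
}
```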

Author

Ah, nicely spotted! I'll update the PR to throw the error.

lib/index.ts Outdated
let resolve;

new Promise((_resolve) => resolve = _resolve)
.then(method.bind(target, ...args))
Collaborator

I'm unsure if binding here is any better/faster than .then(() => method.apply(target, args)), might be worth testing...
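The two forms are behaviorally equivalent; only the per-job allocation cost differs. A quick standalone equivalence check (names here are illustrative, not from the PR):

```javascript
// Both variants invoke method with the same target and arguments; the
// closure form avoids allocating a bound function up front, which is the
// cost being questioned above.
function viaBind(method, target, args) {
  return Promise.resolve().then(method.bind(target, ...args));
}

function viaClosure(method, target, args) {
  return Promise.resolve().then(() => method.apply(target, args));
}
```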

Author

@adriancooney adriancooney Apr 5, 2018

Yep, your suggestion is much faster: https://jsperf.com/bind-creation-vs-closure-creation/1

I'll add that to the PR.

Edit: Ran the benchmarks again and there seems to be some improvement in performance (against master too!). Nice!

The new benchmarks (I've updated the original post):

  Benchmark                                                                            |     master      |      PR
  ------------------------------------------------------------------------------------------------------------------------
  DEBUG - Schedule & Flush - function ................................................ | 126,951.07 op/s | 139,381.79 op/s
  DEBUG - Schedule & Flush - target, function ........................................ | 126,113.43 op/s | 130,448.98 op/s
  DEBUG - Schedule & Flush - target, string method name .............................. | 130,435.87 op/s | 128,308.48 op/s
  DEBUG - Schedule & Flush - target, string method name, 1 argument .................. | 133,912.92 op/s | 125,822.98 op/s
  DEBUG - Schedule & Flush - target, string method name, 2 arguments ................. | 132,708.16 op/s | 122,860.41 op/s

Old benchmarks for posterity:

  Benchmark                                                                            |     master      |      PR
  ------------------------------------------------------------------------------------------------------------------------
  DEBUG - Schedule & Flush - function ................................................ | 126,951.07 op/s | 112,396.69 op/s
  DEBUG - Schedule & Flush - target, function ........................................ | 126,113.43 op/s | 118,414.02 op/s
  DEBUG - Schedule & Flush - target, string method name .............................. | 130,435.87 op/s | 125,710.38 op/s
  DEBUG - Schedule & Flush - target, string method name, 1 argument .................. | 133,912.92 op/s | 128,377.54 op/s
  DEBUG - Schedule & Flush - target, string method name, 2 arguments ................. | 132,708.16 op/s | 129,048.85 op/s

@adriancooney
Author

adriancooney commented Apr 5, 2018

Thanks for the review @rwjblue! Can you elaborate on your thinking around the ordering guarantees? I had some suspicions myself, since Promise execution isn't synchronous, but the tests passed so I figured it might be alright. Any ideas on how to design a test case for it?

@ef4
Contributor

ef4 commented Apr 5, 2018

I don't think this will break anything as long as Promise.then uses proper microtask timing. But I don't think all our supported browsers have correct microtask timing for Promise.then. That's why backburner itself uses MutationObserver instead when it needs a microtask wait.

Does dev tools stitch together async frames across backburner's platform.next? If so, that would be great and you could just use that directly, and we'll know we have good timing on all browsers.

I also worry that it's risky to change the timing behavior when the debug flag is on vs off. It seems possible that merely turning on DEBUG would make your bug disappear, or make new bugs appear. For example, this change is enough to alter the behavior of a bug like #332. I would feel better if we just confirmed that there's no performance hit and made this the permanent behavior. I would not expect a performance hit as long as these are native microtasks.
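The timing concern can be seen in miniature (a standalone sketch, not Backburner's flush code): wrapping a job in a promise moves it from a synchronous flush to a microtask, so code that inspects state immediately after flushing observes different results depending on which mode is active.

```javascript
// A synchronous flush runs the job before flush() returns; the
// promise-wrapped variant defers it to a microtask. If DEBUG switched
// between these two behaviors, turning DEBUG on could hide or surface
// timing-dependent bugs.
function flushSync(job) {
  job();
}

function flushViaPromise(job) {
  Promise.resolve().then(job);
}
```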

@rwjblue
Collaborator

rwjblue commented Apr 6, 2018

I had some suspicions myself since Promise execution isn't sync but the tests passed so I figured it might have been alright.

Tests do not currently run with DEBUG on and off (we should fix that though)...

@rwjblue
Collaborator

rwjblue commented Apr 6, 2018

I also worry that it's risky to change the timing behavior when the debug flag is on vs off. It seems possible that merely turning on DEBUG would make your bug disappear, or make new bugs appear. For example, this change is enough to alter the behavior of a bug like #332. I would feel better if we just confirmed that there's no performance hit and made this the permanent behavior. I would not expect a performance hit as long as these are native microtasks.

In general, I agree. I think the issue here would be that I don't think we can change schedule to always be async (we can only do it if we are being flushed by an autorun)...
