Startup performance regression during June #101685

Closed
jrieken opened this issue Jul 3, 2020 · 4 comments
Labels: candidate (probable candidate for fixing in the next release), important (high-priority), perf-startup
Milestone: June 2020

Comments

jrieken (Member) commented Jul 3, 2020

The windows perf bot shows a slight regression during the June iteration. For instance, version 376d9d9d785ccca128fdbb16f001446d0ad64d32 takes 1862ms whereas version 8cd3fe080bee38d28ba01d0d71713d6ec22b254e is faster at 1666ms. There is surely some 10% fluctuation, but there might be more to it. I have measured both versions by hand and code loading shows a slight slowdown (~50ms), though code size only increased a little (~50kb).

One theory is that we now load all "init scripts" via script tags and not via node.js anymore, and the network tab shows some request stalling. We should investigate and understand that, while also drilling into other potential causes.
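For illustration, a minimal sketch of the two loading strategies being compared; the function names are illustrative, not VS Code's actual loader API:

```ts
// Minimal sketch of the two strategies; names are illustrative only.
function loadViaNodeRequire(modulePath: string): void {
  // Synchronous load through Node.js; no renderer network scheduling involved.
  require(modulePath);
}

function loadViaScriptTag(src: string, onLoad: () => void): void {
  // Asynchronous load through a <script> tag; the request goes through the
  // renderer's network stack and can be queued or stalled.
  const node = document.createElement('script');
  node.src = src;
  node.addEventListener('load', onLoad);
  document.head.appendChild(node);
}
```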

(screenshot: 2020-07-03 at 16:59:49)

bpasero (Member) commented Jul 3, 2020

Well, we always did load workbench.js via a script tag, even before.

One idea that circulated was to concatenate the scripts we need on startup into one before minification, and thus have only one script tag instead of 4. This would mean we need a workbench-dev.html and a workbench.html for built versions, similar to what we have for web too.
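A rough sketch of what such a concatenation step could look like, assuming a plain Node build script; the file names are illustrative, not the actual build setup:

```ts
import * as fs from 'fs';
import * as path from 'path';

// Hypothetical pre-minification step: concatenate the startup scripts into a
// single file so the built workbench.html needs only one <script> tag.
const startupScripts = ['bootstrap.js', 'bootstrap-window.js', 'workbench.js']; // illustrative list

function concatStartupScripts(srcRoot: string, outFile: string): void {
  const combined = startupScripts
    .map(rel => fs.readFileSync(path.join(srcRoot, rel), 'utf8'))
    .join('\n;\n'); // statement separator between concatenated files
  fs.writeFileSync(outFile, combined);
}
```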

bpasero (Member) commented Jul 3, 2020

Also adding @deepak1556 for more insights, in case he is aware of script loading being slow compared to node.js require.

@jrieken jrieken added this to the June 2020 milestone Jul 3, 2020
@jrieken jrieken added the candidate (probable candidate for fixing in the next release) and important (high-priority) labels Jul 3, 2020
jrieken (Member, Author) commented Jul 3, 2020

(screenshot: 2020-07-03 at 17:54:23)

There seems to be a severe problem with cached data. We seem to generate a lot less of it with recent insiders, and that has a severe impact on code loading. With 8cd3 it took ~260ms and cached data for the main bundle was 8.9MB; with 376d it takes ~460ms and cached data for the main bundle is only 2.9MB.

The theory is that we generate cached data in multiple rounds: the more of the code has been executed, the more cached data is created, and the process stops once no increase in cached data is detected. Something there must have stopped working.
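A minimal sketch of that round-based process, assuming Node's vm API and a simplified snapshot loop (this is not the actual loader code):

```ts
import * as fs from 'fs';
import * as vm from 'vm';

// Simplified sketch: after the script has started running, keep taking
// cached-data snapshots while more of the bundle gets executed, and stop
// once a snapshot no longer grows.
function scheduleCachedDataRounds(script: vm.Script, cachePath: string, lastSize = 0): void {
  setTimeout(() => {
    const data = script.createCachedData(); // everything V8 has compiled so far
    if (data.length > lastSize) {
      fs.writeFileSync(cachePath, data);    // more code ran, bigger cache: keep it
      scheduleCachedDataRounds(script, cachePath, data.length);
    }
    // no increase detected -> stop generating further rounds
  }, 10_000);
}
```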

jrieken (Member, Author) commented Jul 3, 2020

(screenshot: 2020-07-03 at 18:39:56)

Ok, turns out that subsequent calls to createCachedData can return less data than before. Unsure why, but I can see how our cached-data-create loop ends up overwriting more data with less data. Unless more data is generated during the same session, it will never recover from that. A simple change would be to adjust this loop to check for more data: https://github.com/microsoft/vscode-loader/blob/2d55bd6cdc71d6daccc02e10f2e24b88dcb09bb8/src/core/scriptLoader.ts#L453
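A hedged sketch of what that adjusted check could look like: only overwrite the stored cached data when the fresh snapshot is actually larger than what is already on disk (again, not the actual loader code):

```ts
import * as fs from 'fs';
import * as vm from 'vm';

// Guard against createCachedData() returning less data on a later call:
// only replace the on-disk cache when the new snapshot is larger.
function maybeUpdateCachedData(script: vm.Script, cachePath: string): void {
  const fresh = script.createCachedData();
  let existingSize = 0;
  try {
    existingSize = fs.statSync(cachePath).size;
  } catch {
    // no cached data on disk yet
  }
  if (fresh.length > existingSize) {
    fs.writeFileSync(cachePath, fresh);
  }
}
```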

However, this isn't a regression. I can see the same behaviour with the insider versions from above as well as with stable. That doesn't make this less severe, though...

jrieken added a commit to microsoft/vscode-loader that referenced this issue Jul 6, 2020
jrieken added a commit that referenced this issue Jul 6, 2020
gjsjohnmurray pushed a commit to gjsjohnmurray/vscode that referenced this issue Jul 6, 2020
@jrieken jrieken closed this as completed Jul 6, 2020
@github-actions github-actions bot locked and limited conversation to collaborators Aug 20, 2020