Re-use the Blazor app for stress runs #21078
Conversation
Force-pushed from ced80d6 to 1052677
src/Components/benchmarkapps/Wasm.Performance/TestApp/wwwroot/benchmarks/stress.js
src/Components/benchmarkapps/Wasm.Performance/TestApp/wwwroot/benchmarks/jsonHandlingStress.js
src/Components/benchmarkapps/Wasm.Performance/TestApp/wwwroot/benchmarks/index.js
Seems sensible so far. I can give it another review once it is out of draft mode.
Don't have a lot of context on this area, but it looks good to me!
Our current stress runs reuse the perf tests, which recreate the Blazor app on each run. Perf runs are meant to run and be done; however, we want stress apps to be long-lived so they can capture issues like memory leaks. This change forks the tests to support stress runs that reuse the app between runs.
Force-pushed from 1052677 to adb5906
...nts/WebAssembly/DebugProxy/src/Microsoft.AspNetCore.Components.WebAssembly.DebugProxy.csproj
This is updated and ready to go.
    Name = "blazorwasm/download-size",
    Value = ((float)benchmarkResult.DownloadSize) / 1024,
});
output.Metadata.Add(new BenchmarkMetadata
@sebastienros this should include WASM's GC reported memory usage. I'm using GC.GetTotalMemory - if there's a better statistic (or more stats) we could use, let me know.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
It's wasm, I have no idea what it will give you. I know that on the server we use WorkingSet64 of the Process.
The .NET memory usage statistic is good and useful, but is only part of the story. We also use memory on the JS side too which we want to be sure isn't leaking.
Might be worth also reporting the complete memory usage via https://stackoverflow.com/a/37010923. Note the requirement to pass --enable-precise-memory-info to the browser when launching it.
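Reading that API defensively might look like the sketch below. It assumes Chrome's non-standard performance.memory object; the helper name and the stubbed object are illustrative, not code from this PR.

```javascript
// Read Chrome's non-standard performance.memory API. Values are only
// precise when the browser is launched with --enable-precise-memory-info;
// the API is absent entirely in Node, Firefox, and Safari.
function readJsHeapStats(perf = globalThis.performance) {
  // Guard against environments where the API does not exist.
  if (!perf || !perf.memory) {
    return null;
  }
  const { usedJSHeapSize, totalJSHeapSize, jsHeapSizeLimit } = perf.memory;
  return { usedJSHeapSize, totalJSHeapSize, jsHeapSizeLimit };
}

// Usage with a stubbed performance object (values mirror the PR's output):
const stats = readJsHeapStats({
  memory: {
    usedJSHeapSize: 20500000,
    totalJSHeapSize: 50400000,
    jsHeapSizeLimit: 2190000000,
  },
});
console.log(stats.usedJSHeapSize); // 20500000
```

Returning null rather than throwing lets a benchmark harness simply skip the JS-heap measurements on browsers that do not expose them.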
Done:
{
"Measurements": [{
"Timestamp": "2020-05-01T21:30:28.430422Z",
"Name": "blazorwasm/wasm-memory",
"Value": 9930.648
}, {
"Timestamp": "2020-05-01T21:30:28.4304227Z",
"Name": "blazorwasm/js-usedjsheapsize",
"Value": 20500000
}, {
"Timestamp": "2020-05-01T21:30:28.4304235Z",
"Name": "blazorwasm/js-totaljsheapsize",
"Value": 50400000
}, {
"Timestamp": "2020-05-01T21:30:28.4304237Z",
"Name": "blazorwasm/commit",
"Value": "adb59063c32bf8a4a95e336c7b4f2a66d3016d6d"
}, {
"Timestamp": "2020-05-01T21:30:28.4304827Z",
"Name": "blazorwasm/render-10-items",
"Value": 8.169897957895026
}, {
"Timestamp": "2020-05-01T21:30:28.4304829Z",
"Name": "blazorwasm/render-100-items",
"Value": 41.96799999335781
}, {
"Timestamp": "2020-05-01T21:30:28.430483Z",
"Name": "blazorwasm/render-1000-items",
"Value": 382.26500002201647
}, {
"Timestamp": "2020-05-01T21:30:28.4304831Z",
"Name": "blazorwasm/jsonserialize-1kb",
"Value": 4.702906975750054
}, {
"Timestamp": "2020-05-01T21:30:28.4304833Z",
"Name": "blazorwasm/jsonserialize-340kb",
"Value": 540.8199999947101
}, {
"Timestamp": "2020-05-01T21:30:28.4304834Z",
"Name": "blazorwasm/jsondeserialize-1kb",
"Value": 6.882711864171266
}, {
"Timestamp": "2020-05-01T21:30:28.4304835Z",
"Name": "blazorwasm/jsondeserialize-340kb",
"Value": 1093.1250000139698
}, {
"Timestamp": "2020-05-01T21:30:28.4304836Z",
"Name": "blazorwasm/orgchart-3-3-org",
"Value": 256.580000044778
}, {
"Timestamp": "2020-05-01T21:30:28.4304837Z",
"Name": "blazorwasm/edit-orgchart-3-2",
"Value": 152.43666669509062
}, {
"Timestamp": "2020-05-01T21:30:28.430484Z",
"Name": "$$Delimiter$$",
"Value": null
}]
}
According to the docs, the sizes reported by performance.memory are not specifically in bytes, just units that can be used for relative comparison.
constructor() {
    if (BlazorStressApp.instance) {
        return BlazorStressApp.instance;
This is pretty weird. Could we change the other code that calls new BlazorStressApp() to use BlazorStressApp.instance instead? Otherwise when you're reading other code, you get the impression that you're getting separate instances, and wouldn't guess you were actually sharing a singleton. Then this special constructor logic could be removed.
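The suggested refactoring, exposing the singleton through a static accessor instead of constructor trickery, could be sketched like this (the lazy-initialization details are assumptions, not the PR's actual code):

```javascript
// Singleton exposed through a static accessor: callers use
// BlazorStressApp.instance and never call `new BlazorStressApp()` directly,
// so the sharing is obvious at every call site.
let instance = null;

class BlazorStressApp {
  static get instance() {
    // Lazily create the shared app on first access.
    if (!instance) {
      instance = new BlazorStressApp();
    }
    return instance;
  }
}

const a = BlazorStressApp.instance;
const b = BlazorStressApp.instance;
console.log(a === b); // true: the same app instance is shared
```

Compared with returning an existing instance from the constructor, this makes the sharing explicit and keeps `new` semantics honest.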
Done
Note that I haven't included changes to the perf/stress driver yet in this change.
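The perf/stress fork the PR describes, a fresh app per perf run versus one long-lived app reused across stress runs, could be sketched as follows. The factory shape and mode names here are illustrative assumptions, not code from this change:

```javascript
// Fork benchmark setup by mode: perf runs get a fresh app each time so
// measurements stay isolated; stress runs reuse one long-lived app so
// leaks accumulate across iterations and become observable.
function createAppFactory(mode, createApp) {
  let sharedApp = null;
  return function getApp() {
    if (mode === 'stress') {
      // Reuse the same app between runs to surface memory leaks over time.
      if (!sharedApp) sharedApp = createApp();
      return sharedApp;
    }
    // Perf mode: recreate the app for every run.
    return createApp();
  };
}

// Usage with a hypothetical app constructor that counts instantiations:
let created = 0;
const makeApp = () => ({ id: ++created });
const getStressApp = createAppFactory('stress', makeApp);
const getPerfApp = createAppFactory('perf', makeApp);
getStressApp(); getStressApp(); // one shared app created
getPerfApp(); getPerfApp();     // two more apps, one per run
console.log(created); // 3
```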