Description
Background
We are exploring differential flame graphs based on simpleperf data for commit-over-commit performance comparisons. We noticed that our callstacks are frequently mangled, which breaks the ability to do comparisons. Here is an article that discusses this problem in more depth.
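To make the failure mode concrete, here is a minimal sketch (not our actual tooling) of a differential comparison over folded-stack output. The stack strings and counts are invented for illustration; the point is that the full stack string is the comparison key, so a stack truncated by a failed unwind (e.g. `render;draw` instead of `main;render;draw`) lands under a different key and silently skews the diff.

```python
from collections import Counter

def parse_folded(lines):
    """Parse folded-stack lines of the form 'frame;frame;frame count'."""
    counts = Counter()
    for line in lines:
        stack, _, count = line.rpartition(" ")
        counts[stack] += int(count)
    return counts

def diff_folded(before, after):
    """Per-stack sample delta between two profiles (after minus before)."""
    a, b = parse_folded(before), parse_folded(after)
    return {stack: b[stack] - a[stack] for stack in a.keys() | b.keys()}

# Hypothetical profiles from two commits.
before = ["main;render;draw 90", "main;io;read 10"]
after  = ["main;render;draw 120", "main;io;read 10"]
print(diff_folded(before, after))  # draw regressed by 30 samples
```

If `after` instead contained a mangled `render;draw 120`, the diff would report `main;render;draw` as improved by 90 and a new `render;draw` hotspot of 120, which is exactly the kind of noise that breaks commit-over-commit comparison.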
I've shared an example of simpleperf data in the following ticket (it's in a private component; Carmen Jackson can help grant access): https://partnerissuetracker.corp.google.com/issues/265123009
Ask
Is there any opportunity to improve the heuristics used to patch up callstacks so they work more reliably? Would reducing the sampling frequency reduce the odds of callstacks being lost?
I do know that Py was able to patch up some of these mangled callstacks using Android-specific heuristics, but they only apply to the main thread: https://github.com/pyricau/simpleperf-cleanup
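For reference, one family of heuristics (a simplified sketch, not necessarily what simpleperf-cleanup does) is suffix matching: a stack that lost its root frames during unwinding can sometimes be repaired by finding a known complete stack from the same thread that ends with the same frames. All stacks and names below are hypothetical.

```python
def patch_truncated(stack, known_stacks):
    """If `stack` appears to be missing its root frames, try to recover
    them by finding a known complete stack whose suffix equals `stack`.
    Returns the patched stack, or the original when there is no unique
    match (an ambiguous patch would be worse than a truncated stack)."""
    matches = [k for k in known_stacks
               if len(k) > len(stack) and k[-len(stack):] == stack]
    return matches[0] if len(matches) == 1 else stack

# Hypothetical complete stacks observed elsewhere in the profile.
known = [("main", "render", "draw"), ("main", "io", "read")]
print(patch_truncated(("render", "draw"), known))  # ('main', 'render', 'draw')
```

The ambiguity check is the weak point: on threads other than the main thread there is no single well-known root, so several known stacks can share a suffix and the heuristic has to give up, which may be why the existing approach is limited to the main thread.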
Affected versions
r23, r24, r25, Canary
Canary version
No response
Host OS
Linux, Mac, Windows
Host OS version
n/a
Affected ABIs
armeabi-v7a, arm64-v8a, x86, x86_64
Build system
Other (specify below)
Other build system
n/a
minSdkVersion
21
Device API level
No response