Profiling cycle #900

Closed
omarabid opened this issue Mar 1, 2022 · 0 comments
omarabid (Contributor) commented Mar 1, 2022

@drahnr commented on Fri Feb 25 2022

Currently, all profiles are gathered over only 10 s, and this interval is hardcoded.

In pyroscope we see the same pattern, while the sample rate is 113 Hz:

[screenshot of the pyroscope timeline]

I would have expected to see a full timeline with 113 bars for each second, rather than an average over 10 s of samples taken at 113 Hz.

Is this intended?


@omarabid commented on Sat Feb 26 2022

@drahnr I think so, yes. To be honest, I'm not quite sure what these bars mean. @petethepig might be able to help here?


@petethepig commented on Sat Feb 26 2022

@drahnr Yeah, this is intended. The idea with the timeline is that it gives you a rough idea of where CPU time was spent. For example, in the screenshot you linked, the second bar is a little higher, which means the CPU was a little more active at that point.

This is because, when we upload data to pyroscope, we aggregate all samples into 10-second intervals and as a result lose some precision.
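
To make the aggregation concrete, here is a minimal Rust sketch of the kind of coalescing described above; the types and names are illustrative only and are not taken from the actual pyroscope client code.

```rust
// Sketch: stack samples collected at ~113 Hz are coalesced into 10-second
// buckets before upload, so per-second detail is lost by the time the
// timeline is rendered.

use std::collections::HashMap;

/// Width of the aggregation/upload window, in seconds (hardcoded today).
const UPLOAD_INTERVAL_SECS: u64 = 10;

/// A single captured sample: the stack it observed and when it was taken.
struct Sample {
    stack: String,
    timestamp_secs: u64,
}

/// Coalesce samples into (bucket_start, stack) -> count. Every sample taken
/// within the same 10-second window with the same stack collapses into one
/// counter, which is why the timeline shows one aggregated bar per interval
/// instead of 113 bars per second.
fn aggregate(samples: &[Sample]) -> HashMap<(u64, String), u64> {
    let mut buckets: HashMap<(u64, String), u64> = HashMap::new();
    for s in samples {
        let bucket_start = (s.timestamp_secs / UPLOAD_INTERVAL_SECS) * UPLOAD_INTERVAL_SECS;
        *buckets.entry((bucket_start, s.stack.clone())).or_insert(0) += 1;
    }
    buckets
}

fn main() {
    let samples = vec![
        Sample { stack: "main;work".into(), timestamp_secs: 3 },
        Sample { stack: "main;work".into(), timestamp_secs: 7 },
        Sample { stack: "main;idle".into(), timestamp_secs: 12 },
    ];
    for ((start, stack), count) in aggregate(&samples) {
        println!(
            "[{}s..{}s] {}: {} samples",
            start,
            start + UPLOAD_INTERVAL_SECS,
            stack,
            count
        );
    }
}
```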


@drahnr commented on Mon Feb 28 2022

My expectation was that the sampled data is compressed, but not coalesced. The backend already provides the same functionality, so I don't see the rationale for doing this on the client side. The amount of data sent is small-ish, as far as I understand.

For our use case, CPU usage depends heavily on the current stage, so a sample period of 10 seconds is far too long to see meaningful items in the flamegraphs, since the majority is consumed by async-related helper functions. It's also too long to see spikes caused by specific functionality.

I'd be interested in making this configurable and I have a PR ready.
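
As a rough illustration of what a configurable window could look like on the client side, here is a sketch; the builder method and type names below are hypothetical and are not taken from the actual pyroscope-rs API or from the PR mentioned above.

```rust
// Hypothetical sketch of a configurable gathering window (not the real API).

use std::time::Duration;

struct ProfilerConfig {
    sample_rate_hz: u32,
    upload_interval: Duration,
}

impl ProfilerConfig {
    fn new() -> Self {
        // Defaults matching the behaviour discussed above:
        // 113 Hz sampling, samples coalesced over 10 seconds.
        Self {
            sample_rate_hz: 113,
            upload_interval: Duration::from_secs(10),
        }
    }

    /// Hypothetical knob: let callers shrink the aggregation window so that
    /// short CPU spikes stay visible in the timeline and flamegraphs.
    fn upload_interval(mut self, interval: Duration) -> Self {
        self.upload_interval = interval;
        self
    }
}

fn main() {
    let config = ProfilerConfig::new().upload_interval(Duration::from_secs(1));
    println!(
        "sampling at {} Hz, uploading every {:?}",
        config.sample_rate_hz, config.upload_interval
    );
}
```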
