Sentry has ~8% overhead in Django test suites #668
I'm "the other engineer at another company". 😄
Definitely want to investigate this, but I can't yet promise a fix because I don't know how pathological the slow codepath you're running into is. I think a good start would be to figure out which test slows down the most, so we can go from the observed behavior of an entire test suite to some sort of microbenchmark. Pytest has a
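The suggested suite-to-microbenchmark step can be sketched concretely: pytest's `--durations=N` option reports the N slowest tests, and the worst offender can then be profiled in isolation with the stdlib. A minimal sketch, with a hypothetical stand-in for the slow test (exception-heavy on purpose, since that is one suspected overhead source):

```python
import cProfile
import io
import pstats

def slow_test_stand_in():
    # Hypothetical placeholder for the slowest test in the suite; a real
    # run would import and call the actual test function instead.
    total = 0
    for i in range(1000):
        try:
            if i % 7 == 0:
                raise ValueError(i)
        except ValueError:
            total += 1
    return total

profiler = cProfile.Profile()
profiler.enable()
slow_test_stand_in()
profiler.disable()

# Print the five most expensive call sites by cumulative time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

Running the same profile with and without `sentry_sdk.init` in the test setup would show where the extra time goes.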
Sorry, it seems that this already shows the entire flame graph for
Yes, that's the flame graph for it. I can't show more, sorry. I guess you could maybe eliminate query recording if you won't be sending the results anywhere?
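Assuming "query recording" here refers to the SQL-query breadcrumbs the Django integration captures, one way to discard them without editing SDK internals is the `before_breadcrumb` hook. A sketch, guarded so it is a no-op where `sentry_sdk` is not installed:

```python
# Discard all breadcrumbs (which include captured SQL queries) before they
# are stored, via the SDK's before_breadcrumb callback.
try:
    import sentry_sdk
except ImportError:
    sentry_sdk = None  # SDK not installed; the sketch degrades to a no-op

def drop_all_breadcrumbs(crumb, hint):
    # Returning None tells the SDK to discard the breadcrumb.
    return None

if sentry_sdk is not None:
    sentry_sdk.init(
        dsn="",  # empty DSN keeps the SDK inert; nothing is sent anywhere
        before_breadcrumb=drop_all_breadcrumbs,
    )
```

Note this drops the stored data but not the instrumentation itself (the hook still runs per query), so it may not recover all of the overhead.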
OK. Can you comment out the
I'm afraid I don't have time to try this. I hope you take your own profiles from a representative application.
@untitaker I have tried this on a Django test suite with ~500 tests; the results are the average of 4 test runs.
Also, is there a way to estimate the approximate overhead of Sentry profiling in production projects?
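There is no single number for production overhead, but the measurement approach used above generalizes: time a representative workload with and without instrumentation and compare. A toy sketch with a no-op hook standing in for the SDK (a real comparison would toggle `sentry_sdk.init` instead):

```python
import time

def workload():
    # Stand-in for one run's worth of application work.
    return sum(i * i for i in range(200_000))

def hooked_workload(hook):
    # Same work with SDK-style instrumentation wrapped around it.
    hook("start")
    result = workload()
    hook("finish")
    return result

def measure(fn, *args, repeats=5):
    # Total wall-clock time over several repeats to smooth out noise.
    start = time.perf_counter()
    for _ in range(repeats):
        fn(*args)
    return time.perf_counter() - start

baseline = measure(workload)
instrumented = measure(hooked_workload, lambda event: None)
print(f"overhead: {(instrumented - baseline) / baseline * 100:.1f}%")
```

With a hook this cheap the measured overhead is near zero; the point is the harness, which lets you plug in the real SDK configuration you intend to ship.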
Thanks @ChillarAnand, that confirms the suspicion. I hope we can free up resources internally to work on this.
This generally depends on the kind of web framework and what kind of extensions you are using: 8% is definitely on the upper end, partly because we hook into a lot of Django. Integrations for AIOHTTP, Sanic and Flask capture much less data, so the overhead will be lower. However, if you install a lot of Flask extensions you may get close to the same overhead as with Django.
Updating this old thread: there is no official collection of numbers yet, but I can update this thread again when we do.
In general it's very hard to put a specific number on the SDK's overhead; it really depends on the app. I'll close this for now, but feel free to reopen or create a new issue if the overhead of Sentry on your specific app seems too high.
Apologies for a somewhat vague report; I'm happy to expand on it once it's decided what the appropriate course of action is.
We've found that disabling Sentry (not calling `init`) in tests saves around 8% of test time on a large Django codebase. This result has been replicated by an engineer at another company with a different Django codebase.
It could be that this is just the expected overhead, in which case documenting it would be great. Some advice in that documentation on whether the cost is worth it would be great as well: is it worth 8% to ensure that Sentry doesn't interact badly with the rest of the codebase, or is Sentry reliable and isolated enough that it's unlikely to catch any issues, making the 8% time saving more important?
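The "not calling `init` in tests" workaround can be sketched as a small guard. `RUNNING_TESTS` is a hypothetical environment flag the test runner would set; any equivalent signal (a dedicated test settings module, pytest detection) works the same way:

```python
import os

def maybe_init_sentry():
    # Skip Sentry entirely during test runs.
    if os.environ.get("RUNNING_TESTS") == "1":
        return False  # tests: leave the SDK uninitialized
    dsn = os.environ.get("SENTRY_DSN")
    if not dsn:
        return False  # no DSN configured, nothing to initialize
    import sentry_sdk  # imported lazily so tests never pay for it
    sentry_sdk.init(dsn=dsn)
    return True

os.environ["RUNNING_TESTS"] = "1"
print(maybe_init_sentry())  # → False while RUNNING_TESTS=1
```

The trade-off discussed above still applies: this buys back the ~8% but means tests never exercise the Sentry integration paths.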
I wouldn't be surprised if the overhead is not expected in typical production use, and tests are a weird case (they throw a lot of handled exceptions, for example).
Alternatively, it could be that this overhead is not expected and is a performance issue that Sentry would like to address. If so, I'm happy to provide data from our test suite if you can point me towards what would be useful for you.