Increased length of calls after upgrading Sentry #2551

Closed
DarrenStack opened this issue Nov 30, 2023 · 11 comments · Fixed by #2593
Assignees
Labels
Integration: Django Type: Bug Something isn't working

Comments

@DarrenStack

How do you use Sentry?

Self-hosted/on-premise

Version

1.33.0

Steps to Reproduce

We recently updated the Sentry SDK in our Django-based product to 1.33.0, and shortly after we saw that our DB calls were consistently taking noticeably longer. Everything still works, so there's no obvious error. The same application with a downgraded Sentry SDK consistently runs with faster calls.

We are running Django 3.2.23 with the Django integration in Sentry, but even removing that integration has made no difference in query duration. I have also disabled tracing by setting enable_tracing to False, which did not change anything.

I've looked at the changelog for 1.33.0 but can't see anything obvious that would cause this. Our application and DB run in the same cloud services provider. Any ideas? So far we've only seen it for DB calls, but I'm not ruling out that it could affect any operation. Thanks in advance!

Expected Result

Upgrading Sentry would not affect the duration of calls in the application.

Actual Result

Attached are sample performance metrics for a test set of calls we made that shows the difference between versions.

[Screenshots: performance metrics showing call durations under the two SDK versions]

@sentrivana
Contributor

sentrivana commented Nov 30, 2023

Hey @DarrenStack, thanks for reporting this!

We've had some performance issues (e.g. here and here) affecting our Django integration that have been fixed only recently (as in, after 1.33), as well as a lot of other bugfixes in the recent releases, so as the first step I'd recommend updating to the current version (1.38.0) and seeing if the DB times go down again.

A couple of questions in case upgrading doesn't help:

  • What DB backend are you using?
  • Do you do anything custom with regards to how you connect to the DB? E.g., defining your own get_connection_params?
  • What SDK version did you upgrade from? 1.32.0?

Sidenote regarding disabling integrations: you mention that you've tried removing the Django integration. The SDK will only actually remove it if you also pass auto_enabling_integrations=False to the SDK init; otherwise the integration is auto-enabled in the background anyway. Since we don't really do a good job of documenting that anywhere (note to self: should fix that), I assume the integration might have still been active in your case.
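A minimal sketch of what that looks like (the DSN below is a placeholder, not a real project):

```python
import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    # Without this flag, DjangoIntegration (and other auto-enabling
    # integrations) are turned on in the background even if not listed.
    auto_enabling_integrations=False,
    integrations=[],  # only explicitly listed integrations will run
)
```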

@DarrenStack
Author

DarrenStack commented Nov 30, 2023

Hi @sentrivana thanks so much for the quick response and all the info. The first linked ticket has a lot of overlap.

On the questions:

  • We are using Postgres on AWS RDS, similar to a user in the first ticket.
  • We do not override get_connection_params(); we just use the DATABASES setting in our settings.py. We are not making any external calls there.
  • We upgraded from 1.32.0. It works fine.

The tip about auto_enabling_integrations=False worked well and we didn't see the issue, so you're right that it's more than likely the DjangoIntegration. I'll test cache_spans tomorrow to see if that makes a difference.

EDIT: We have tried 1.38.0 also which has not fixed the issue

@sentrivana
Contributor

Thanks @DarrenStack! I've been looking at the diff between 1.32.0 and 1.33.0 and nothing really stands out. One long shot is extracting some Redis connection data on each operation -- do you by any chance use Redis, e.g. to cache DB queries? Do you have any other library/framework between the app and the DB, like SQLAlchemy, or are you just using plain Django ORM?

Could you add debug=True to the SDK init, start the app, and post the debug logs? It doesn't need to generate any transactions, I'm mostly interested in the initial output that is printed while the SDK is setting up, so roughly until this line:

 [sentry] DEBUG: Setting SDK name to 'sentry.python.django'
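To capture that output, the SDK can be initialized with debug logging enabled — a minimal sketch (placeholder DSN):

```python
import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    debug=True,  # print [sentry] DEBUG lines to stderr while the SDK sets up
)
```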

@DarrenStack
Author

DarrenStack commented Dec 1, 2023

Hey @sentrivana, thanks again for all the support. We are not using Redis and it's just the plain Django ORM.

We have configured our cache to use the database, as shown here. This was initially set up for our Django rate-limiting middleware.
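For context, a database-backed cache in Django is configured along these lines (the table name below is a placeholder; the actual table is created with `manage.py createcachetable`):

```python
# settings.py sketch — "my_cache_table" is a placeholder name
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.db.DatabaseCache",
        "LOCATION": "my_cache_table",  # DB table used for cache entries
    }
}
```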

Attached are the setup logs.

    2023-11-30T17:05:48.573+00:00  [sentry] DEBUG: Setting up integrations (with default = True)
    2023-11-30T17:05:48.576+00:00  [sentry] DEBUG: Did not import default integration sentry_sdk.integrations.aiohttp.AioHttpIntegration: AIOHTTP not installed
    2023-11-30T17:05:48.577+00:00  [sentry] DEBUG: Did not import default integration sentry_sdk.integrations.bottle.BottleIntegration: Bottle not installed
    2023-11-30T17:05:48.578+00:00  [sentry] DEBUG: Did not import default integration sentry_sdk.integrations.celery.CeleryIntegration: Celery not installed
    2023-11-30T17:05:48.578+00:00  [sentry] DEBUG: Did not import default integration sentry_sdk.integrations.falcon.FalconIntegration: Falcon not installed
    2023-11-30T17:05:48.579+00:00  [sentry] DEBUG: Did not import default integration sentry_sdk.integrations.fastapi.FastApiIntegration: Starlette is not installed
    2023-11-30T17:05:48.580+00:00  [sentry] DEBUG: Did not import default integration sentry_sdk.integrations.flask.FlaskIntegration: Flask is not installed
    2023-11-30T17:05:48.580+00:00  [sentry] DEBUG: Did not import default integration sentry_sdk.integrations.httpx.HttpxIntegration: httpx is not installed
    2023-11-30T17:05:48.580+00:00  [sentry] DEBUG: Did not import default integration sentry_sdk.integrations.pyramid.PyramidIntegration: Pyramid not installed
    2023-11-30T17:05:48.581+00:00  [sentry] DEBUG: Did not import default integration sentry_sdk.integrations.rq.RqIntegration: RQ not installed
    2023-11-30T17:05:48.582+00:00  [sentry] DEBUG: Did not import default integration sentry_sdk.integrations.sanic.SanicIntegration: Sanic not installed
    2023-11-30T17:05:48.582+00:00  [sentry] DEBUG: Did not import default integration sentry_sdk.integrations.sqlalchemy.SqlalchemyIntegration: SQLAlchemy not installed.
    2023-11-30T17:05:48.583+00:00  [sentry] DEBUG: Did not import default integration sentry_sdk.integrations.starlette.StarletteIntegration: Starlette is not installed
    2023-11-30T17:05:48.583+00:00  [sentry] DEBUG: Did not import default integration sentry_sdk.integrations.tornado.TornadoIntegration: Tornado not installed
    2023-11-30T17:05:48.583+00:00  [sentry] DEBUG: Setting up previously not enabled integration django
    2023-11-30T17:05:48.586+00:00  [sentry] DEBUG: Setting up previously not enabled integration argv
    2023-11-30T17:05:48.586+00:00  [sentry] DEBUG: Setting up previously not enabled integration atexit
    2023-11-30T17:05:48.586+00:00  [sentry] DEBUG: Setting up previously not enabled integration dedupe
    2023-11-30T17:05:48.586+00:00  [sentry] DEBUG: Setting up previously not enabled integration excepthook
    2023-11-30T17:05:48.586+00:00  [sentry] DEBUG: Setting up previously not enabled integration logging
    2023-11-30T17:05:48.586+00:00  [sentry] DEBUG: Setting up previously not enabled integration modules
    2023-11-30T17:05:48.586+00:00  [sentry] DEBUG: Setting up previously not enabled integration stdlib
    2023-11-30T17:05:48.586+00:00  [sentry] DEBUG: Setting up previously not enabled integration threading
    2023-11-30T17:05:48.586+00:00  [sentry] DEBUG: Setting up previously not enabled integration boto3
    2023-11-30T17:05:48.588+00:00  [sentry] DEBUG: Setting up previously not enabled integration redis
    2023-11-30T17:05:48.589+00:00  [sentry] DEBUG: Did not enable default integration redis: Redis client not installed
    2023-11-30T17:05:48.589+00:00  [sentry] DEBUG: Enabling integration django
    2023-11-30T17:05:48.589+00:00  [sentry] DEBUG: Enabling integration argv
    2023-11-30T17:05:48.589+00:00  [sentry] DEBUG: Enabling integration atexit
    2023-11-30T17:05:48.589+00:00  [sentry] DEBUG: Enabling integration dedupe
    2023-11-30T17:05:48.589+00:00  [sentry] DEBUG: Enabling integration excepthook
    2023-11-30T17:05:48.589+00:00  [sentry] DEBUG: Enabling integration logging
    2023-11-30T17:05:48.589+00:00  [sentry] DEBUG: Enabling integration modules
    2023-11-30T17:05:48.589+00:00  [sentry] DEBUG: Enabling integration stdlib
    2023-11-30T17:05:48.589+00:00  [sentry] DEBUG: Enabling integration threading
    2023-11-30T17:05:48.589+00:00  [sentry] DEBUG: Enabling integration boto3
    2023-11-30T17:05:48.589+00:00  [sentry] DEBUG: Setting SDK name to 'sentry.python.django'

@antonpirker
Member

Ok, so it is most likely the DjangoIntegration adding the overhead. Please have a look at the options middleware_spans, signals_spans, and cache_spans here: https://docs.sentry.io/platforms/python/integrations/django/#options

Try setting all of them to False and see whether the overhead drops significantly.
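For example, a sketch with all three span options disabled (placeholder DSN):

```python
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[
        DjangoIntegration(
            middleware_spans=False,  # no spans for middleware calls
            signals_spans=False,     # no spans for Django signal receivers
            cache_spans=False,       # no spans for cache operations
        )
    ],
)
```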

@DarrenStack
Author

Thanks again for all the ideas; they're much appreciated. I only got around to testing this now. Unfortunately I still see the same performance issues with all the options set to False.

I'll set aside some time in the next week to debug some more.

@DarrenStack
Author

Hey @sentrivana and @antonpirker, I think I found something related to this!

We are using psycopg2-binary for our interactions with the Postgres DB. It defines get_dsn_parameters as a built-in method rather than a Python function, so the check here fails even though the attribute is callable. We then fall back to get_connection_params, which seems to add a little extra overhead.

I realise this code was added to avoid calling get_dsn_parameters when it isn't actually callable, so I don't know how difficult it is to change. Would inspect.isroutine work as a check?
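For illustration, a stdlib analogue of the failing check: bound methods of C-implemented types are builtin_function_or_method objects, which inspect.isfunction rejects but inspect.isroutine accepts (psycopg2's get_dsn_parameters is the same kind of object, assuming the C extension build):

```python
import inspect

# A bound method of a C-implemented type, like psycopg2's
# connection.get_dsn_parameters, is a builtin_function_or_method.
builtin_method = [].append

print(inspect.isfunction(builtin_method))  # False: not a pure-Python function
print(inspect.isroutine(builtin_method))   # True: also covers built-in methods
print(callable(builtin_method))            # True: it can still be called
```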

@sentrivana
Contributor

Hey @DarrenStack, thanks for investigating and for coming back with an update! Will look into fixing this this week.

@DarrenStack
Author

Great, thanks @sentrivana!

@sentrivana
Contributor

@DarrenStack The fix will be out shortly in 1.39.1. Thanks again for helping us debug!

@DarrenStack
Author

Yup, tested 1.39.1 and it works! Thanks for the really quick work!
