Commit 33c8c77

feat(python): add docs for LiteLLM integration (#15091)
Add docs for the LiteLLM Python SDK integration. Requires getsentry/sentry-python#4864. Closes https://linear.app/getsentry/issue/TET-1217/litellm-docs

Not urgent; can wait up to 1 week+.

Co-authored-by: Ivana Kellyer <ivana.kellyer@sentry.io>
1 parent 2d4434c commit 33c8c77

File tree

3 files changed: +120 −0 lines changed

docs/platforms/python/integrations/index.mdx

Lines changed: 1 addition & 0 deletions

```diff
@@ -46,6 +46,7 @@ The Sentry SDK uses integrations to hook into the functionality of popular libra
 | <LinkWithPlatformIcon platform="openai-agents" label="OpenAI Agents SDK" url="/platforms/python/integrations/openai-agents" /> | |
 | <LinkWithPlatformIcon platform="langchain" label="LangChain" url="/platforms/python/integrations/langchain" /> ||
 | <LinkWithPlatformIcon platform="langgraph" label="LangGraph" url="/platforms/python/integrations/langgraph" /> ||
+| <LinkWithPlatformIcon platform="litellm" label="LiteLLM" url="/platforms/python/integrations/litellm" /> | |

 ### Data Processing
```

Lines changed: 118 additions & 0 deletions (new file)

---
title: LiteLLM
description: "Learn about using Sentry for LiteLLM."
---

This integration connects Sentry with the [LiteLLM Python SDK](https://github.com/BerriAI/litellm).

Once you've installed this SDK, you can use Sentry AI Agents Monitoring, a Sentry dashboard that helps you understand what's going on with your AI requests.

Sentry AI Monitoring automatically collects information about prompts, tools, tokens, and models. Learn more about the [AI Agents Dashboard](/product/insights/ai/agents).
## Install

Install `sentry-sdk` from PyPI with the `litellm` extra:

```bash {tabTitle:pip}
pip install "sentry-sdk[litellm]"
```

```bash {tabTitle:uv}
uv add "sentry-sdk[litellm]"
```
## Configure

Add `LiteLLMIntegration()` to your `integrations` list:

```python
import sentry_sdk
from sentry_sdk.integrations.litellm import LiteLLMIntegration

sentry_sdk.init(
    dsn="___PUBLIC_DSN___",
    # Set traces_sample_rate to 1.0 to capture 100%
    # of transactions for tracing.
    traces_sample_rate=1.0,
    # Add data like inputs and responses;
    # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
    send_default_pii=True,
    integrations=[
        LiteLLMIntegration(),
    ],
)
```
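Note that `traces_sample_rate` is a sampling probability, not a count: each transaction is recorded independently with that probability. A back-of-the-envelope sketch (illustrative only, not SDK code):

```python
def expected_traces(total_requests: int, traces_sample_rate: float) -> int:
    """Rough expectation: with independent sampling, about
    rate * total transactions end up in Sentry."""
    return round(total_requests * traces_sample_rate)

print(expected_traces(10_000, 1.0))   # rate 1.0 captures every transaction
print(expected_traces(10_000, 0.25))  # roughly a quarter of them
```

In production you'd typically lower the rate from `1.0` to keep trace volume manageable.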
## Verify

Verify that the integration works by making a chat completion request to LiteLLM.

```python
import sentry_sdk
from sentry_sdk.integrations.litellm import LiteLLMIntegration
import litellm

sentry_sdk.init(
    dsn="___PUBLIC_DSN___",
    traces_sample_rate=1.0,
    send_default_pii=True,
    integrations=[
        LiteLLMIntegration(),
    ],
)

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "say hello"}],
    max_tokens=100,
)
print(response.choices[0].message.content)
```

After running this script, the resulting data should show up in the `AI Spans` tab on the `Explore > Traces > Trace` page on Sentry.io.

If you manually created an <PlatformLink to="/tracing/instrumentation/custom-instrumentation/ai-agents-module/#invoke-agent-span">Invoke Agent Span</PlatformLink> (not done in the example above), the data will also show up in the [AI Agents Dashboard](/product/insights/ai/agents).

It may take a couple of moments for the data to appear in [sentry.io](https://sentry.io).
## Behavior

- The LiteLLM integration automatically connects Sentry to the supported LiteLLM methods.

- The supported functions are currently `completion` and `embedding` (both sync and async).

- Sentry considers LLM inputs/outputs as PII (personally identifiable information) and doesn't include PII data by default. If you want to include the data, set `send_default_pii=True` in the `sentry_sdk.init()` call. To explicitly exclude prompts and outputs despite `send_default_pii=True`, configure the integration with `include_prompts=False` as shown in the [Options section](#options) below.
## Options

By adding `LiteLLMIntegration` to your `sentry_sdk.init()` call explicitly, you can set options to change its behavior:

```python
import sentry_sdk
from sentry_sdk.integrations.litellm import LiteLLMIntegration

sentry_sdk.init(
    # ...
    # Add data like inputs and responses;
    # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
    send_default_pii=True,
    integrations=[
        LiteLLMIntegration(
            include_prompts=False,  # LLM inputs/outputs will not be sent to Sentry, despite send_default_pii=True
        ),
    ],
)
```

You can pass the following keyword arguments to `LiteLLMIntegration()`:

- `include_prompts`:

  Whether LLM inputs and outputs should be sent to Sentry. Sentry considers this data personally identifiable information (PII) by default. If you want to include the data, set `send_default_pii=True` in the `sentry_sdk.init()` call. To explicitly exclude prompts and outputs despite `send_default_pii=True`, configure the integration with `include_prompts=False`.

  The default is `True`.
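The two flags gate prompt data jointly: inputs/outputs are recorded only when both the SDK-wide `send_default_pii` and the integration-level `include_prompts` are enabled. A tiny illustrative helper (not part of the SDK) makes the truth table explicit:

```python
def prompts_recorded(send_default_pii: bool, include_prompts: bool) -> bool:
    # Illustrative only: prompt data reaches Sentry iff both flags are on.
    return send_default_pii and include_prompts

for pii in (True, False):
    for inc in (True, False):
        print(f"send_default_pii={pii}, include_prompts={inc} "
              f"-> prompts recorded: {prompts_recorded(pii, inc)}")
```

Only the `True`/`True` combination sends prompt content; every other combination keeps it out of your events.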
## Supported Versions

- LiteLLM: 1.77.0+
- Python: 3.8+
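If you want to confirm at runtime that the installed LiteLLM meets the minimum above, a small pre-flight check can compare versions. This sketch uses only the standard library and is not part of the SDK; the minimum-version tuple mirrors the table above:

```python
from importlib.metadata import PackageNotFoundError, version

MIN_LITELLM = (1, 77, 0)  # minimum supported LiteLLM, per the table above

def parse_version(v: str) -> tuple:
    """Keep the leading numeric dotted prefix, e.g. '1.77.0rc1' -> (1, 77, 0)."""
    parts = []
    for piece in v.split("."):
        digits = ""
        for ch in piece:
            if ch.isdigit():
                digits += ch
            else:
                break
        if not digits:
            break
        parts.append(int(digits))
    return tuple(parts)

def litellm_supported() -> bool:
    try:
        installed = version("litellm")
    except PackageNotFoundError:
        return False  # litellm isn't installed at all
    # Tuple comparison handles differing lengths, e.g. (1, 80) >= (1, 77, 0)
    return parse_version(installed) >= MIN_LITELLM
```

A guard like this is only a convenience; the integration itself checks compatibility when it initializes.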

docs/platforms/python/tracing/instrumentation/custom-instrumentation/ai-agents-module.mdx

Lines changed: 1 addition & 0 deletions

```diff
@@ -17,6 +17,7 @@ The Python SDK supports automatic instrumentation for some AI libraries. We reco
 - <PlatformLink to="/integrations/openai-agents/">OpenAI Agents SDK</PlatformLink>
 - <PlatformLink to="/integrations/langchain/">LangChain</PlatformLink>
 - <PlatformLink to="/integrations/langgraph/">LangGraph</PlatformLink>
+- <PlatformLink to="/integrations/litellm/">LiteLLM</PlatformLink>

 ## Manual Instrumentation
```