
Merge branch 'master' into cohere
antonpirker authored May 10, 2024
2 parents 80ebc33 + 2cdc635 commit 6b8149a
Showing 12 changed files with 155 additions and 60 deletions.
80 changes: 80 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,85 @@
# Changelog

## 2.1.1

- Fix trace propagation in Celery tasks started by Celery Beat. (#3047) by @antonpirker
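
  For context, a minimal sketch of the setup this fix targets (the DSN and option values are placeholders; `monitor_beat_tasks` is the existing `CeleryIntegration` option referenced in the Celery diff further down):

  ```python
  import sentry_sdk
  from sentry_sdk.integrations.celery import CeleryIntegration

  sentry_sdk.init(
      dsn="...",
      enable_tracing=True,
      traces_sample_rate=1.0,
      # With Beat monitoring enabled, tasks enqueued by the scheduler are the
      # ones whose trace propagation this release fixes.
      integrations=[CeleryIntegration(monitor_beat_tasks=True)],
  )
  ```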

## 2.1.0

- fix(quart): Fix Quart integration (#3043) by @szokeasaurusrex

- **New integration:** [Langchain](https://docs.sentry.io/platforms/python/integrations/langchain/) (#2911) by @colin-sentry

Usage (Langchain is auto-enabling, so you do not need to do anything special):
```python
from langchain_openai import ChatOpenAI
import sentry_sdk

sentry_sdk.init(
    dsn="...",
    enable_tracing=True,
    traces_sample_rate=1.0,
)

llm = ChatOpenAI(model="gpt-3.5-turbo-0125", temperature=0)
```

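A hedged follow-up: spans are only produced once the client is actually used, for example (the prompt is made up):

```python
llm.invoke("Tell me a joke about tracing")
```
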
Check out [the LangChain docs](https://docs.sentry.io/platforms/python/integrations/langchain/) for details.

- **New integration:** [Anthropic](https://docs.sentry.io/platforms/python/integrations/anthropic/) (#2831) by @czyber

Usage (add `AnthropicIntegration` to your `sentry_sdk.init()` call):
```python
from anthropic import Anthropic

import sentry_sdk
from sentry_sdk.integrations.anthropic import AnthropicIntegration

sentry_sdk.init(
    dsn="...",
    enable_tracing=True,
    traces_sample_rate=1.0,
    integrations=[AnthropicIntegration()],
)

client = Anthropic()
```
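
A hedged usage sketch of what the integration then instruments (the model name and prompt are illustrative only):

```python
message = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, Claude"}],
)
```
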
Check out [the Anthropic docs](https://docs.sentry.io/platforms/python/integrations/anthropic/) for details.

- **New integration:** [Huggingface Hub](https://docs.sentry.io/platforms/python/integrations/huggingface/) (#3033) by @colin-sentry

Usage (Huggingface Hub is auto-enabling, so you do not need to do anything special):

```python
import sentry_sdk
from huggingface_hub import InferenceClient

sentry_sdk.init(
    dsn="...",
    enable_tracing=True,
    traces_sample_rate=1.0,
)

client = InferenceClient("some-model")
```

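A hedged usage sketch; `text_generation` is the `InferenceClient` method exercised in the tests further down, and the prompt is made up:

```python
client.text_generation("The answer to the universe is")
```
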
Check out [the Huggingface docs](https://docs.sentry.io/platforms/python/integrations/huggingface/) for details (coming soon!).

- fix(huggingface): Reduce API cross-section for huggingface in test (#3042) by @colin-sentry
- fix(django): Fix Django ASGI integration on Python 3.12 (#3027) by @bellini666
- feat(perf): Add ability to put measurements directly on spans. (#2967) by @colin-sentry (see the sketch after this list)
- fix(tests): Fix trytond tests (#3031) by @sentrivana
- fix(tests): Update `pytest-asyncio` to fix CI (#3030) by @sentrivana
- fix(docs): Link to respective migration guides directly (#3020) by @sentrivana
- docs(scope): Add docstring to `Scope.set_tags` (#2978) by @szokeasaurusrex
- test(scope): Fix typos in assert error message (#2978) by @szokeasaurusrex
- feat(scope): New `set_tags` function (#2978) by @szokeasaurusrex
- test(scope): Add unit test for `Scope.set_tags` (#2978) by @szokeasaurusrex
- feat(scope): Add `set_tags` to top-level API (#2978) by @szokeasaurusrex
- test(scope): Add unit test for top-level API `set_tags` (#2978) by @szokeasaurusrex
- feat(tests): Parallelize tox (#3025) by @sentrivana
- build(deps): Bump checkouts/data-schemas from `4aa14a7` to `4381a97` (#3028) by @dependabot
- meta(license): Bump copyright year (#3029) by @szokeasaurusrex
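
A combined sketch of the two new APIs noted above: `sentry_sdk.set_tags` (#2978) and measurements directly on spans (#2967). The tag keys, measurement name, and values here are made up:

```python
import sentry_sdk

sentry_sdk.init(dsn="...", enable_tracing=True, traces_sample_rate=1.0)

# Set several tags at once on the current scope (new top-level API).
sentry_sdk.set_tags({"page.locale": "de-at", "page.type": "article"})

with sentry_sdk.start_transaction(name="example-transaction"):
    with sentry_sdk.start_span(op="function", description="process items") as span:
        # Measurements can now be attached directly to a span.
        span.set_measurement("items_processed", 42)
```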

## 2.0.1

### Various fixes & improvements
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -28,7 +28,7 @@
copyright = "2019-{}, Sentry Team and Contributors".format(datetime.now().year)
author = "Sentry Team and Contributors"

release = "2.0.1"
release = "2.1.1"
version = ".".join(release.split(".")[:2]) # The short X.Y version.


@@ -1,12 +1,12 @@
import boto3
import sentry_sdk
import sentry_sdk


monitor_slug = "python-sdk-aws-lambda-tests-cleanup"
monitor_config = {
"schedule": {
"type": "crontab",
"value": "0 12 * * 0", # 12 o'clock on Sunday
"value": "0 12 * * 0", # 12 o'clock on Sunday
},
"timezone": "UTC",
"checkin_margin": 2,
@@ -24,7 +24,7 @@ def delete_lambda_functions(prefix="test_"):
"""
client = boto3.client("lambda", region_name="us-east-1")
functions_deleted = 0

functions_paginator = client.get_paginator("list_functions")
for functions_page in functions_paginator.paginate():
for func in functions_page["Functions"]:
@@ -39,17 +39,17 @@ def delete_lambda_functions(prefix="test_"):
print(f"Got exception: {ex}")

return functions_deleted


def lambda_handler(event, context):
functions_deleted = delete_lambda_functions()

sentry_sdk.metrics.gauge(
key="num_aws_functions_deleted",
key="num_aws_functions_deleted",
value=functions_deleted,
)

return {
'statusCode': 200,
'body': f"{functions_deleted} AWS Lambda functions deleted successfully."
"statusCode": 200,
"body": f"{functions_deleted} AWS Lambda functions deleted successfully.",
}
3 changes: 2 additions & 1 deletion sentry_sdk/_types.py
@@ -19,6 +19,7 @@
from typing import Dict
from typing import List
from typing import Mapping
from typing import NotRequired
from typing import Optional
from typing import Tuple
from typing import Type
@@ -63,7 +64,7 @@
"MeasurementValue",
{
"value": float,
"unit": Optional[MeasurementUnit],
"unit": NotRequired[Optional[MeasurementUnit]],
},
)

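A small illustration of what the `NotRequired` change permits, using a simplified stand-in for `MeasurementValue` (plain `str` instead of `MeasurementUnit`); the values are made up:

```python
from typing_extensions import NotRequired, TypedDict  # backport for Python < 3.11


class MeasurementValue(TypedDict):
    value: float
    unit: NotRequired[str]  # simplified; the SDK uses Optional[MeasurementUnit]


# "unit" may now be omitted entirely; before this change the key had to be
# present, even if its value was None.
with_unit: MeasurementValue = {"value": 1.2, "unit": "second"}
without_unit: MeasurementValue = {"value": 42.0}
```
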
2 changes: 1 addition & 1 deletion sentry_sdk/consts.py
@@ -466,4 +466,4 @@ def _get_default_options():
del _get_default_options


VERSION = "2.0.1"
VERSION = "2.1.1"
1 change: 1 addition & 0 deletions sentry_sdk/integrations/__init__.py
@@ -69,6 +69,7 @@ def iter_default_integrations(with_auto_enabling_integrations):

_AUTO_ENABLING_INTEGRATIONS = [
"sentry_sdk.integrations.aiohttp.AioHttpIntegration",
"sentry_sdk.integrations.anthropic.AnthropicIntegration",
"sentry_sdk.integrations.ariadne.AriadneIntegration",
"sentry_sdk.integrations.arq.ArqIntegration",
"sentry_sdk.integrations.asyncpg.AsyncPGIntegration",
12 changes: 9 additions & 3 deletions sentry_sdk/integrations/anthropic.py
@@ -12,13 +12,19 @@
package_version,
)

from anthropic.resources import Messages

from typing import TYPE_CHECKING

try:
from anthropic.resources import Messages

if TYPE_CHECKING:
from anthropic.types import MessageStreamEvent
except ImportError:
raise DidNotEnable("Anthropic not installed")


if TYPE_CHECKING:
from typing import Any, Iterator
from anthropic.types import MessageStreamEvent
from sentry_sdk.tracing import Span


25 changes: 22 additions & 3 deletions sentry_sdk/integrations/celery/__init__.py
@@ -30,6 +30,7 @@
from typing import List
from typing import Optional
from typing import TypeVar
from typing import Union

from sentry_sdk._types import EventProcessor, Event, Hint, ExcInfo
from sentry_sdk.tracing import Span
@@ -223,6 +224,16 @@ def _update_celery_task_headers(original_headers, span, monitor_beat_tasks):
return updated_headers


class NoOpMgr:
def __enter__(self):
# type: () -> None
return None

def __exit__(self, exc_type, exc_value, traceback):
# type: (Any, Any, Any) -> None
return None


def _wrap_apply_async(f):
# type: (F) -> F
@wraps(f)
@@ -242,9 +253,17 @@ def apply_async(*args, **kwargs):

task = args[0]

with sentry_sdk.start_span(
op=OP.QUEUE_SUBMIT_CELERY, description=task.name
) as span:
task_started_from_beat = (
sentry_sdk.Scope.get_isolation_scope()._name == "celery-beat"
)

span_mgr = (
sentry_sdk.start_span(op=OP.QUEUE_SUBMIT_CELERY, description=task.name)
if not task_started_from_beat
else NoOpMgr()
) # type: Union[Span, NoOpMgr]

with span_mgr as span:
kwargs["headers"] = _update_celery_task_headers(
kwarg_headers, span, integration.monitor_beat_tasks
)
4 changes: 3 additions & 1 deletion sentry_sdk/integrations/quart.py
@@ -87,9 +87,11 @@ def patch_asgi_app():
# type: () -> None
old_app = Quart.__call__

@ensure_integration_enabled(QuartIntegration, old_app)
async def sentry_patched_asgi_app(self, scope, receive, send):
# type: (Any, Any, Any, Any) -> Any
if sentry_sdk.get_client().get_integration(QuartIntegration) is None:
return await old_app(self, scope, receive, send)

middleware = SentryAsgiMiddleware(lambda *a, **kw: old_app(self, *a, **kw))
middleware.__call__ = middleware._run_asgi3
return await middleware(scope, receive, send)
10 changes: 10 additions & 0 deletions sentry_sdk/tracing_utils.py
@@ -421,6 +421,16 @@ def update(self, other_dict):
except AttributeError:
pass

def __repr__(self):
# type: (...) -> str
return "<PropagationContext _trace_id={} _span_id={} parent_span_id={} parent_sampled={} dynamic_sampling_context={}>".format(
self._trace_id,
self._span_id,
self.parent_span_id,
self.parent_sampled,
self.dynamic_sampling_context,
)


class Baggage:
"""
2 changes: 1 addition & 1 deletion setup.py
@@ -21,7 +21,7 @@ def get_file_text(file_name):

setup(
name="sentry-sdk",
version="2.0.1",
version="2.1.1",
author="Sentry Team and Contributors",
author_email="hello@sentry.io",
url="https://github.com/getsentry/sentry-python",
56 changes: 16 additions & 40 deletions tests/integrations/huggingface_hub/test_huggingface_hub.py
@@ -1,14 +1,8 @@
import itertools
import json

import pytest
from huggingface_hub import (
InferenceClient,
TextGenerationOutput,
TextGenerationOutputDetails,
TextGenerationStreamOutput,
TextGenerationOutputToken,
TextGenerationStreamDetails,
)
from huggingface_hub.errors import OverloadedError

@@ -35,19 +29,15 @@ def test_nonstreaming_chat_completion(
client = InferenceClient("some-model")
if details_arg:
client.post = mock.Mock(
return_value=json.dumps(
[
TextGenerationOutput(
generated_text="the model response",
details=TextGenerationOutputDetails(
finish_reason="TextGenerationFinishReason",
generated_tokens=10,
prefill=[],
tokens=[], # not needed for integration
),
)
]
).encode("utf-8")
return_value=b"""[{
"generated_text": "the model response",
"details": {
"finish_reason": "length",
"generated_tokens": 10,
"prefill": [],
"tokens": []
}
}]"""
)
else:
client.post = mock.Mock(
@@ -96,27 +86,13 @@ def test_streaming_chat_completion(
client = InferenceClient("some-model")
client.post = mock.Mock(
return_value=[
b"data:"
+ json.dumps(
TextGenerationStreamOutput(
token=TextGenerationOutputToken(
id=1, special=False, text="the model "
),
),
).encode("utf-8"),
b"data:"
+ json.dumps(
TextGenerationStreamOutput(
token=TextGenerationOutputToken(
id=2, special=False, text="response"
),
details=TextGenerationStreamDetails(
finish_reason="length",
generated_tokens=10,
seed=0,
),
)
).encode("utf-8"),
b"""data:{
"token":{"id":1, "special": false, "text": "the model "}
}""",
b"""data:{
"token":{"id":2, "special": false, "text": "response"},
"details":{"finish_reason": "length", "generated_tokens": 10, "seed": 0}
}""",
]
)
with start_transaction(name="huggingface_hub tx"):
