Added span support for genAI langchain llm invocation #3665
Conversation
Resolved review thread (outdated): ...ntelemetry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/__init__.py
Thanks for putting this together. Added some comments.
Resolved review thread (outdated): ...opentelemetry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/utils.py
4 resolved review threads (outdated): ...ry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/callback_handler.py
Resolved review thread (outdated): instrumentation-genai/opentelemetry-instrumentation-langchain/pyproject.toml
Force-pushed from 3940934 to 072b0fd.
Resolved review thread (outdated): ...emetry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/span_manager.py
2 resolved review threads (outdated): ...ry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/callback_handler.py
Resolved review thread: ...ry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/callback_handler.py
Resolved review thread (outdated): ...emetry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/span_manager.py
Resolved review thread (outdated): ...ry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/callback_handler.py
2 resolved review threads (outdated): ...emetry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/span_manager.py
Force-pushed from fed5254 to 01caea4.
Resolved review thread (outdated): instrumentation-genai/opentelemetry-instrumentation-langchain/tests/test_llm_call.py
2 resolved review threads (outdated): ...ry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/callback_handler.py
Resolved review thread: ...emetry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/span_manager.py
Resolved review thread (outdated): ...emetry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/span_manager.py
Force-pushed from 3e1255c to d653fed.
My biggest concern is whether this is leaking memory; otherwise, just some small requests.
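The memory-leak concern above can be sketched concretely: if spans are kept in a map keyed by the callback's run ID, the map stays bounded only when every end (and error) path deletes its entry. The `SpanManager` below is a hypothetical, dependency-free illustration of that bookkeeping, not the PR's actual implementation; the class and field names are invented for this sketch.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List
from uuid import UUID, uuid4


@dataclass
class _SpanState:
    """Hypothetical per-run bookkeeping entry (names are illustrative)."""
    span: Any
    children: List[UUID] = field(default_factory=list)


class SpanManager:
    """Minimal sketch: spans keyed by run_id and removed on end,
    so the map cannot grow without bound."""

    def __init__(self) -> None:
        self._spans: Dict[UUID, _SpanState] = {}

    def start(self, run_id: UUID, span: Any) -> None:
        self._spans[run_id] = _SpanState(span)

    def end(self, run_id: UUID) -> None:
        # pop() removes the entry even if the span object is retained
        # elsewhere; an on_llm_error path must call this too, or entries
        # for failed runs accumulate forever.
        state = self._spans.pop(run_id, None)
        if state is not None and hasattr(state.span, "end"):
            state.span.end()

    def __len__(self) -> int:
        return len(self._spans)
```

Ending twice is a no-op here, which matters because callbacks can race or repeat; the map size returning to zero after `end` is the property a leak test would assert.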
4 resolved review threads (outdated): ...ry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/callback_handler.py
2 resolved review threads (outdated): ...emetry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/span_manager.py
Resolved review thread: ...emetry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/span_manager.py
2 resolved review threads (outdated): ...ntelemetry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/__init__.py
Resolved review thread (outdated): instrumentation-genai/opentelemetry-instrumentation-langchain/pyproject.toml
Force-pushed from cce6652 to 2be6a8b.
Action items from Aug 28 2025 Python SIG:
Resolved review thread (outdated): instrumentation-genai/opentelemetry-instrumentation-langchain/pyproject.toml
Force-pushed from 520a437 to 3508a66.
Sept 9 2025 SemConv meeting discussion.
Force-pushed from a44e6b7 to bd2cc41.
Force-pushed from d68026e to d5771dd.
Description
Taking inspiration from Traceloop's OpenLLMetry support for LangChain instrumentation, this PR adds callback-handler support for LLM start and end events, generating a span with the attributes (no content) listed here.
Attributes added to the span:
-> gen_ai.operation.name: Str(chat)
-> gen_ai.request.model: Str(gpt-3.5-turbo)
-> gen_ai.request.top_p: Double(0.9)
-> gen_ai.request.frequency_penalty: Double(0.5)
-> gen_ai.request.presence_penalty: Double(0.5)
-> gen_ai.request.stop_sequences: Slice(["\n","Human:","AI:"])
-> gen_ai.request.seed: Int(100)
-> gen_ai.request.max_tokens: Int(100)
-> gen_ai.provider.name: Str(openai)
-> gen_ai.request.temperature: Double(0.1)
-> gen_ai.response.finish_reasons: Slice(["stop"])
-> gen_ai.response.model: Str(gpt-3.5-turbo-0125)
-> gen_ai.response.id: Str(chatcmpl-Bz8yrvPnydD9pObv625n2CGBPHS13)
-> gen_ai.usage.input_tokens: Int(24)
-> gen_ai.usage.output_tokens: Int(7)
Please refer to the complete span-attribute output here: https://docs.google.com/document/d/13SXuf4iZSl2MqWUJtmoDEqFxuHZ1Y8hykW_ctlykcDE/edit?tab=t.0
To keep this PR concise, we have added only span support for now; metric and log support will follow in future PRs.
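The `gen_ai.request.*` attributes listed above can be sketched as a simple mapping from LangChain invocation parameters to semantic-convention keys. The attribute names below come from the PR's own list; the `request_attributes` helper, its parameter names, and the extraction logic are illustrative assumptions, not the PR's actual code.

```python
# Hypothetical mapping from invocation-parameter names to the
# gen_ai.request.* semantic-convention attributes listed in this PR.
_PARAM_TO_ATTR = {
    "model": "gen_ai.request.model",
    "temperature": "gen_ai.request.temperature",
    "top_p": "gen_ai.request.top_p",
    "frequency_penalty": "gen_ai.request.frequency_penalty",
    "presence_penalty": "gen_ai.request.presence_penalty",
    "stop": "gen_ai.request.stop_sequences",
    "seed": "gen_ai.request.seed",
    "max_tokens": "gen_ai.request.max_tokens",
}


def request_attributes(invocation_params: dict) -> dict:
    """Collect span attributes from the params that are actually set.

    Absent or None parameters are skipped, so spans only carry the
    attributes the request really used.
    """
    attrs = {"gen_ai.operation.name": "chat"}
    for param, attr in _PARAM_TO_ATTR.items():
        value = invocation_params.get(param)
        if value is not None:
            attrs[attr] = value
    return attrs
```

Response-side attributes (`gen_ai.response.model`, `gen_ai.usage.input_tokens`, etc.) would be set analogously in the end callback, where the provider's response metadata becomes available.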
Fixes # (issue)
Type of change
Please delete options that are not relevant.
How Has This Been Tested?
Please describe the tests that you ran to verify your changes and provide instructions so we can reproduce them. Please also list any relevant details of your test configuration.
pytest -rP test_llm_call.py
Does This PR Require a Core Repo Change?
Checklist:
See contributing.md for styleguide, changelog guidelines, and more.