🔴 Required Information
Describe the Bug: Setting OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true does not populate the gen_ai.prompt or gen_ai.completion attributes on generate_content spans. These remain null in the OTLP-exported spans, making it impossible for observability backends (e.g. Dynatrace AI Observability) to display conversation content. The gcp.vertex.agent.llm_request / gcp.vertex.agent.llm_response attributes are also null. Additionally, we observe repeated warnings: "Tried calling _add_event on an ended span."
Steps to Reproduce:
- Deploy an ADK agent with OTLP export configured
- Set env var OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
- Send a request to the agent
- Query exported spans for gen_ai.prompt, gen_ai.completion
- All attributes are null on generate_content spans
Expected Behavior: gen_ai.prompt and gen_ai.completion should contain prompt/response text per OpenTelemetry GenAI semantic conventions.
Observed Behavior:
- All prompt/response attributes are null
- Logs show: "Tried calling _add_event on an ended span." (repeated 4-5 times per invocation)
- Other span metadata is populated correctly: gen_ai.agent.name, gen_ai.request.model, token counts, and span duration
Environment Details:
- ADK Library Version: 1.28.1
- OS: Linux (GKE container)
- Python Version: 3.12
Model Information:
- Are you using LiteLLM: No
- Which model: gemini-3-flash-preview
🟡 Optional Information
Regression: Unknown — first attempt at enabling prompt content capture.
Logs:
{"message": "Tried calling _add_event on an ended span.", "severity": "WARNING", "logger_name": "opentelemetry.sdk.trace"}
Additional Context: The env var is read in google/adk/telemetry/tracing.py at line ~80. The "_add_event on an ended span" warning suggests content serialization completes after the span has already been closed by BatchSpanProcessor. Related: #4829
Minimal Reproduction Code:
from google.adk.telemetry.setup import maybe_set_otel_providers
maybe_set_otel_providers([])
With OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true set, gen_ai.prompt and gen_ai.completion are null on all generate_content spans.
How often has this issue occurred?: