Description
Discussed in #721
Originally posted by robin-fiddler August 22, 2025
According to the OpenTelemetry gen_ai semantic convention, system prompts/instructions should be represented as events. In addition, the attribute key currently used is "system_prompt", whereas the semantic convention names it "gen_ai.system.message".
In Strands, I can currently see the System Prompt only at the Agent span level. It is not being passed into the Model (chat) span.
For comparison, other frameworks such as LangSmith include the system prompt in the model span as a system message. Since the system prompt is ultimately passed to the LLM, it should also be included in the Model span for consistency and observability.
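For reference, this is roughly what a convention-aligned shape could look like if the system prompt were emitted as an event on the Model (chat) span. It is a minimal sketch using the plain OpenTelemetry Python API, not Strands code; the span name, the "gen_ai.operation.name" attribute, and the attribute key used to carry the message content are illustrative assumptions (the semconv defines the content in an event body, which span events do not have).

```python
from opentelemetry import trace

tracer = trace.get_tracer(__name__)


def record_chat_span(system_prompt: str, user_message: str) -> None:
    # "chat <model>" follows the span naming pattern suggested by the gen_ai
    # semconv; "example-model" is a placeholder model id.
    with tracer.start_as_current_span("chat example-model") as span:
        span.set_attribute("gen_ai.operation.name", "chat")
        # System prompt recorded on the Model span itself, as a semconv event
        # rather than a non-standard "system_prompt" attribute on the Agent span.
        span.add_event(
            "gen_ai.system.message",
            attributes={"content": system_prompt},  # attribute key is illustrative
        )
        span.add_event(
            "gen_ai.user.message",
            attributes={"content": user_message},  # attribute key is illustrative
        )
        # ...model invocation, response/choice events, token usage, etc...
```

With a configured tracer provider, the system prompt would then appear under the chat span in the tracing backend, which is the parity with LangSmith described above.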
Expected Behavior
• The system prompt should be available in the Model (chat) span (in addition to being visible in the Agent span).
• This ensures parity with other frameworks and makes the tracing more consistent with the gen_ai convention.
Suggested Action
• Update the instrumentation so that system prompts are propagated to the Model span (see the sketch after this list).
• Confirm alignment with the gen_ai semantic convention for proper OTEL trace representation.
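To make the suggestion concrete, below is a hypothetical sketch of where the propagation could happen. The AgentTracer class, the start_model_span method, and its parameters are invented for illustration and are not the actual Strands tracer API; the point is only that the code that opens the Model (chat) span should receive the same system prompt that is already recorded on the Agent span.

```python
from opentelemetry import trace
from opentelemetry.trace import Span


class AgentTracer:
    """Illustrative stand-in for the instrumentation layer (not the Strands API)."""

    def __init__(self) -> None:
        self._tracer = trace.get_tracer(__name__)

    def start_model_span(self, model_id: str, system_prompt: str | None = None) -> Span:
        # Attach the system prompt when the chat span is created, so it is
        # visible on the Model span and not only on the parent Agent span.
        span = self._tracer.start_span(f"chat {model_id}")
        span.set_attribute("gen_ai.operation.name", "chat")
        if system_prompt:
            # Event name from the gen_ai semconv; the attribute key carrying
            # the content is an assumption (span events have no structured body).
            span.add_event("gen_ai.system.message", attributes={"content": system_prompt})
        return span


# Usage sketch: the agent loop passes the system prompt it already has when it
# opens the model invocation span, then ends the span after the model call.
# span = AgentTracer().start_model_span("example-model", "You are a helpful assistant.")
# ...invoke the model, record response events... ; span.end()
```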