Add support for OpenAI Responses API instrumentation #1
base: main
Conversation
Co-authored-by: johanste <15110018+johanste@users.noreply.github.com>
@copilot, only try to instrument responses if the installed openai client library version supports responses, and only run the responses test if the openai client library version supports it.
Added version checking in commit 6b5f542. The instrumentation now only wraps the Responses API when the installed openai client library version supports it.
@copilot, doesn't openai version 1.66 and later support responses? Use the lowest version you can for responses support.
Updated to use OpenAI >= 1.66.0 in commit e9c4a2f. The version check now uses the lowest supported version (1.66.0), where the responses API was first introduced.
@copilot, fix the imports of ChatCompletion. It should be from openai.types.chat, not from openai.resources.chat.completions. |
Fixed in commit f8a2e1c. Changed the ChatCompletion imports from openai.resources.chat.completions to openai.types.chat.
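The corrected import, shown as a diff (illustrative — the surrounding file is not shown in this thread):

```diff
-from openai.resources.chat.completions import ChatCompletion
+from openai.types.chat import ChatCompletion
```

`ChatCompletion` is a response type, so it lives under `openai.types.chat`; `openai.resources.chat.completions` holds the resource classes whose methods get wrapped.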
Summary
This PR adds instrumentation support for the OpenAI Responses API (structured outputs) to the `opentelemetry-instrumentation-openai-v2` library, following the same monkeypatching pattern used for chat completions.
Background
The OpenAI SDK introduced the Responses API (`client.responses.create`) for structured outputs in version 1.66.0. This API was not previously instrumented, meaning calls to it would not generate telemetry data (spans, logs, or metrics).
Changes
This PR instruments both synchronous and asynchronous versions of the Responses API:
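For context, the sync and async call shapes being instrumented look roughly like this (illustrative only — the model name and prompt are placeholders; running these requires openai >= 1.66.0 and an API key):

```python
def sync_responses_call():
    # Imported lazily so this sketch parses without the SDK installed
    from openai import OpenAI

    client = OpenAI()
    # Responses.create is the synchronous method wrapped by this PR
    return client.responses.create(model="gpt-4o", input="Say hello")


async def async_responses_call():
    from openai import AsyncOpenAI

    client = AsyncOpenAI()
    # AsyncResponses.create is the asynchronous method wrapped by this PR
    return await client.responses.create(model="gpt-4o", input="Say hello")
```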
Implementation Details
Version Checking:
- `_is_responses_api_supported()` function to detect if OpenAI SDK >= 1.66.0
- Uses `packaging.version` for reliable version comparison

New wrapper functions in `patch.py`:
- `responses_create()` - Wraps synchronous `Responses.create` method
- `async_responses_create()` - Wraps asynchronous `AsyncResponses.create` method
- `_set_responses_attributes()` - Sets span attributes for responses
- `_record_responses_metrics()` - Records metrics for responses API calls

Instrumentation hooks in `__init__.py`:
- `wrap_function_wrapper` calls for `openai.resources.responses.responses.Responses.create`
- `wrap_function_wrapper` calls for `openai.resources.responses.responses.AsyncResponses.create`
- `unwrap` calls in `_uninstrument()` method

Telemetry Captured
The instrumentation captures (when responses API is available):
- Spans, logs, and metrics for responses API calls
- Message content (only when `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true`)

Tests
Added comprehensive test coverage with version-aware skipping:
- `test_responses.py` - Tests for synchronous responses API with/without content capture (skipped if OpenAI < 1.66.0)
- `test_async_responses.py` - Tests for asynchronous responses API with/without content capture (skipped if OpenAI < 1.66.0)

Documentation
Updated documentation to include responses API examples:
- `README.rst` - Added usage example showing both chat completions and responses API
- `__init__.py` - Added responses API example

Bug Fixes
- Fixed `ChatCompletion` imports to use `openai.types.chat` instead of `openai.resources.chat.completions`

Testing
Verified that chat completions instrumentation continues to work unchanged and that responses instrumentation is only applied when the installed OpenAI SDK supports it.
Backward Compatibility
This implementation maintains full backward compatibility. Users with OpenAI SDK versions < 1.66.0 will continue to have chat completions instrumented while responses API instrumentation is gracefully skipped.
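The graceful-skip behavior can be sketched as follows (the method paths come from the PR description; `instrument_responses_if_supported` and its boolean parameter are stand-ins for the real code, which uses `wrapt.wrap_function_wrapper` plus the version check described above):

```python
# Dotted paths of the Responses API methods this PR wraps
RESPONSES_METHODS = (
    "openai.resources.responses.responses.Responses.create",
    "openai.resources.responses.responses.AsyncResponses.create",
)


def instrument_responses_if_supported(supported: bool) -> list:
    """Return the method paths that would be wrapped, given the version check result."""
    if not supported:
        # OpenAI SDK < 1.66.0: skip responses instrumentation entirely,
        # leaving chat completions instrumentation untouched.
        return []
    wrapped = []
    for method_path in RESPONSES_METHODS:
        # Real implementation: wrapt.wrap_function_wrapper(module, name, wrapper)
        wrapped.append(method_path)
    return wrapped
```

The same `supported` flag gates `unwrap` in `_uninstrument()`, so uninstrumenting on an older SDK never touches attributes that were never wrapped.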