Add support for OpenAI Responses API to all OpenAI-related plugins #1447
+331 −107
Implementation Complete: OpenAI Responses API Support
Successfully added support for the OpenAI Responses API (`/v1/responses`) to all five OpenAI-related plugins, while maintaining full backward compatibility with the existing Chat Completions and Completions APIs.

Changes Made
1. New Model Classes (`DevProxy.Abstractions/LanguageModel/OpenAIModels.cs`)

- `OpenAIResponsesRequest` - handles requests with `input`, `modalities`, `instructions`, `store`, `previous_response_id`, `tools`, and `max_output_tokens`
- `OpenAIResponsesResponse` - parses responses with the `output` items array, `created_at`, and `status`
- `OpenAIResponsesOutputItem` - represents output items with `type`, `role`, and a `content` array
- `OpenAIResponsesContentPart` - represents content parts with `type` and `text`
- Extended `OpenAIRequest.TryGetOpenAIRequest()` to detect Responses API requests (checks for the `input` + `modalities` fields)
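To make the shape of the new models concrete, here is a minimal sketch following the names in the list above. The `System.Text.Json` attributes, nullability, and property types are assumptions for illustration; in the actual `OpenAIModels.cs` the request and response classes presumably also extend the existing `OpenAIRequest`/`OpenAIResponse` base types, which the sketch omits for brevity.

```csharp
using System.Collections.Generic;
using System.Text.Json.Serialization;

// Minimal sketch of the new Responses API models; property names follow the
// PR description, everything else (types, nullability, base classes) is assumed.
public class OpenAIResponsesRequest
{
    [JsonPropertyName("input")]
    public object? Input { get; set; }                  // string or array of input items

    [JsonPropertyName("modalities")]
    public IEnumerable<string>? Modalities { get; set; }

    [JsonPropertyName("instructions")]
    public string? Instructions { get; set; }

    [JsonPropertyName("store")]
    public bool? Store { get; set; }

    [JsonPropertyName("previous_response_id")]
    public string? PreviousResponseId { get; set; }

    [JsonPropertyName("tools")]
    public IEnumerable<object>? Tools { get; set; }

    [JsonPropertyName("max_output_tokens")]
    public long? MaxOutputTokens { get; set; }
}

public class OpenAIResponsesResponse
{
    [JsonPropertyName("output")]
    public IEnumerable<OpenAIResponsesOutputItem>? Output { get; set; }

    [JsonPropertyName("created_at")]
    public long CreatedAt { get; set; }

    [JsonPropertyName("status")]
    public string? Status { get; set; }
}

public class OpenAIResponsesOutputItem
{
    [JsonPropertyName("type")]
    public string? Type { get; set; }

    [JsonPropertyName("role")]
    public string? Role { get; set; }

    [JsonPropertyName("content")]
    public IEnumerable<OpenAIResponsesContentPart>? Content { get; set; }
}

public class OpenAIResponsesContentPart
{
    [JsonPropertyName("type")]
    public string? Type { get; set; }

    [JsonPropertyName("text")]
    public string? Text { get; set; }
}
```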
2. LanguageModelFailurePlugin ✅

- Removed the local `TryGetOpenAIRequest` method in favor of the shared method
3. LanguageModelRateLimitingPlugin ✅

- Token counting works for Responses API traffic (same `Usage` property)
- Removed the local `TryGetOpenAIRequest` method
4. OpenAITelemetryPlugin ✅

- Added an `OpenAIResponsesRequest` case to request type detection
- `AddResponsesApiRequestTags()` - tags the operation as "responses" and includes prompt, instructions, and modalities (sketched below)
- `AddResponsesApiResponseTags()` - extracts usage, status, and completion text from the output items
- Updated `GetOperationName()` to return "responses" for the Responses API
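To illustrate the request-side tagging, here is a rough sketch of what `AddResponsesApiRequestTags()` could look like, reusing the `OpenAIResponsesRequest` sketch above. The use of `System.Diagnostics.Activity` and the `gen_ai.*` tag names are assumptions based on common OpenTelemetry semantic conventions; the plugin's actual tag names and method signature may differ.

```csharp
using System.Diagnostics;

internal static class ResponsesApiTaggingSketch
{
    // Sketch only: tag names and the Activity-based approach are assumptions.
    public static void AddResponsesApiRequestTags(Activity activity, OpenAIResponsesRequest request)
    {
        // Report the Responses API as its own operation type
        activity.SetTag("gen_ai.operation.name", "responses");

        if (request.Instructions is not null)
        {
            activity.SetTag("gen_ai.request.instructions", request.Instructions);
        }

        if (request.Modalities is not null)
        {
            activity.SetTag("gen_ai.request.modalities", string.Join(",", request.Modalities));
        }

        // A plain string input is recorded as the prompt; structured input
        // items would need to be flattened first
        if (request.Input is string prompt)
        {
            activity.SetTag("gen_ai.prompt", prompt);
        }
    }
}
```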
5. OpenAIUsageDebuggingPlugin ✅

- Works unchanged with the `Usage` object, which has the same structure across all APIs
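For reference, this is the Chat Completions-style usage shape the plugin presumably keeps relying on. The class and property names are assumed; only the claim that the same structure is shared across all APIs comes from this PR.

```csharp
using System.Text.Json.Serialization;

// Assumed shape of the shared usage object (names are illustrative);
// per the PR, Responses API responses are mapped onto the same structure.
public class OpenAIUsageSketch
{
    [JsonPropertyName("prompt_tokens")]
    public long PromptTokens { get; set; }

    [JsonPropertyName("completion_tokens")]
    public long CompletionTokens { get; set; }

    [JsonPropertyName("total_tokens")]
    public long TotalTokens { get; set; }
}
```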
6. OpenAIMockResponsePlugin ✅

- `ConvertResponsesInputToMessages()` - converts Responses API input to the Chat Completions format for the local LLM (sketched below)
- `ConvertToResponsesApiResponse()` - transforms the local LLM response into the Responses API format with proper output items
- `ExtractTextFromContent()` helper - extracts text from various content formats
- Removed the local `TryGetOpenAIRequest` method
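Here is an illustrative sketch of the input conversion, again reusing the `OpenAIResponsesRequest` sketch above. The tuple-based message shape and the `JsonElement` handling are assumptions; the real `ConvertResponsesInputToMessages()` presumably targets the plugin's own chat message types and covers more input shapes.

```csharp
using System.Collections.Generic;
using System.Text.Json;

internal static class ResponsesConversionSketch
{
    // Sketch only: flattens Responses API input into chat-style (role, content) pairs
    // so the local LLM can be called through the Chat Completions path.
    public static IEnumerable<(string Role, string Content)> ConvertResponsesInputToMessages(
        OpenAIResponsesRequest request)
    {
        var messages = new List<(string Role, string Content)>();

        // Instructions map naturally onto a system message
        if (!string.IsNullOrEmpty(request.Instructions))
        {
            messages.Add(("system", request.Instructions));
        }

        // A plain string input becomes a single user message
        if (request.Input is string text)
        {
            messages.Add(("user", text));
        }
        // Structured input (deserialized as a JsonElement array) is flattened per item
        else if (request.Input is JsonElement { ValueKind: JsonValueKind.Array } items)
        {
            foreach (var item in items.EnumerateArray())
            {
                var role = item.TryGetProperty("role", out var r) && r.ValueKind == JsonValueKind.String
                    ? r.GetString()!
                    : "user";
                var content = item.TryGetProperty("content", out var c) ? c.ToString() : string.Empty;
                messages.Add((role, content));
            }
        }

        return messages;
    }
}
```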
Key Features

✅ Full Backward Compatibility - All existing Chat Completions and Completions functionality is unchanged
✅ Unified Detection - A single `OpenAIRequest.TryGetOpenAIRequest()` method handles all request types (see the sketch after this list)
✅ Proper Priority - The Responses API is checked first (the recommended API per OpenAI docs)
✅ Usage Tracking - Token usage, costs, and rate limiting work across all APIs
✅ Telemetry Support - Full OpenTelemetry integration for Responses API
✅ Mock Support - Local LLM mocking with automatic format conversion
✅ No SDKs - Pure C# implementation following existing patterns
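The following sketch shows the detection priority described above: Responses API first, then Chat Completions, then Completions. The real `OpenAIRequest.TryGetOpenAIRequest()` would deserialize the body into the corresponding request model rather than return a label; the method shape here is an assumption, only the field-based checks come from the description above.

```csharp
using System.Text.Json;

internal static class OpenAIRequestDetectionSketch
{
    // Sketch only: illustrates the detection priority, not the actual implementation.
    public static bool TryDetectOpenAIRequestKind(string body, out string? kind)
    {
        kind = null;

        if (string.IsNullOrWhiteSpace(body))
        {
            return false;
        }

        try
        {
            using var doc = JsonDocument.Parse(body);
            var root = doc.RootElement;

            // Responses API is checked first; identified by the input/modalities fields
            if (root.TryGetProperty("input", out _) || root.TryGetProperty("modalities", out _))
            {
                kind = "responses";
            }
            // Chat Completions requests carry a messages array
            else if (root.TryGetProperty("messages", out _))
            {
                kind = "chat.completions";
            }
            // Completions requests carry a prompt
            else if (root.TryGetProperty("prompt", out _))
            {
                kind = "completions";
            }

            return kind is not null;
        }
        catch (JsonException)
        {
            return false;
        }
    }
}
```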
Testing
All five plugins now support the OpenAI Responses API alongside existing APIs! 🎉