Copilot AI commented Nov 16, 2025

Implementation Complete: OpenAI Responses API Support

Successfully added support for the OpenAI Responses API (/v1/responses) to all five OpenAI-related plugins while maintaining full backward compatibility with existing Chat Completions and Completions APIs.

Changes Made

1. New Model Classes (DevProxy.Abstractions/LanguageModel/OpenAIModels.cs)

  • OpenAIResponsesRequest - Handles requests with input, modalities, instructions, store, previous_response_id, tools, and max_output_tokens
  • OpenAIResponsesResponse - Parses responses with an output items array, created_at, and status
  • OpenAIResponsesOutputItem - Represents output items with type, role, and a content array
  • OpenAIResponsesContentPart - Represents content parts with type and text
  • Updated OpenAIRequest.TryGetOpenAIRequest() to detect Responses API requests (checks for the input + modalities fields)
  • Detection order prioritizes the Responses API first, as recommended by OpenAI

2. LanguageModelFailurePlugin

  • Detects Responses API requests using shared detection method
  • For string input: appends fault prompt to input text
  • For structured items: adds fault as new message item to maintain conversation flow
  • Removed duplicate TryGetOpenAIRequest method in favor of shared method
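The two input-handling branches can be illustrated with a short sketch (Python for brevity; the helper name and the exact item shape are assumptions — the real plugin operates on the C# model classes):

```python
def append_fault_prompt(request: dict, fault_prompt: str) -> dict:
    """Inject a fault prompt into a Responses API request body."""
    inp = request.get("input")
    if isinstance(inp, str):
        # String input: append the fault prompt to the input text
        request["input"] = inp + "\n" + fault_prompt
    elif isinstance(inp, list):
        # Structured items: add the fault as a new message item,
        # preserving the existing conversation flow
        inp.append({
            "type": "message",
            "role": "user",
            "content": [{"type": "input_text", "text": fault_prompt}],
        })
    return request
```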

3. LanguageModelRateLimitingPlugin

  • Detects Responses API requests using shared detection method
  • Tracks token usage from Responses API responses (parses Usage property)
  • Throttling behavior works identically for Responses API
  • Custom error responses compatible with both APIs
  • Removed duplicate TryGetOpenAIRequest method
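Token extraction from the Usage property might look like the following sketch (Python; the helper is hypothetical). Note that OpenAI's published schemas name the fields input_tokens/output_tokens for the Responses API and prompt_tokens/completion_tokens for Chat Completions, so a defensive reader checks both:

```python
def extract_token_usage(response_body: dict) -> tuple[int, int]:
    """Read prompt/completion token counts from a usage object,
    accepting both Responses API and Chat Completions field names."""
    usage = response_body.get("usage", {})
    prompt = usage.get("input_tokens", usage.get("prompt_tokens", 0))
    completion = usage.get("output_tokens", usage.get("completion_tokens", 0))
    return prompt, completion
```

With a uniform (prompt, completion) pair, the throttling logic itself does not need to know which API produced the response.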

4. OpenAITelemetryPlugin

  • Added OpenAIResponsesRequest case to request type detection
  • Created AddResponsesApiRequestTags() - tags operation as "responses", includes prompt, instructions, modalities
  • Created AddResponsesApiResponseTags() - extracts usage, status, and completion text from output items
  • Updated GetOperationName() to return "responses" for Responses API
  • Cost calculation and metrics work with existing pricing data
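The request-tagging step can be sketched like this (Python; the tag names are loosely modeled on OpenTelemetry GenAI semantic conventions and may differ from the ones the plugin actually emits):

```python
def responses_api_request_tags(request: dict) -> dict[str, str]:
    """Build telemetry tags for a Responses API request.

    Tags the operation as "responses" and captures the prompt,
    instructions, and modalities when present.
    """
    tags = {"gen_ai.operation.name": "responses"}
    if isinstance(request.get("input"), str):
        tags["gen_ai.prompt"] = request["input"]
    if request.get("instructions"):
        tags["gen_ai.request.instructions"] = request["instructions"]
    if request.get("modalities"):
        tags["gen_ai.request.modalities"] = ",".join(request["modalities"])
    return tags
```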

5. OpenAIUsageDebuggingPlugin

  • Already compatible! Uses shared detection which now includes Responses API
  • Parses Usage object which has same structure across all APIs
  • CSV logging works seamlessly for Responses API requests

6. OpenAIMockResponsePlugin

  • Detects Responses API requests using shared detection method
  • Created ConvertResponsesInputToMessages() - converts Responses API input to Chat Completions format for local LLM
    • Handles string input
    • Handles structured items with roles and content
    • Includes instructions as system message
  • Created ConvertToResponsesApiResponse() - transforms local LLM response to Responses API format with proper output items
  • Created ExtractTextFromContent() helper - extracts text from various content formats
  • Removed duplicate TryGetOpenAIRequest method
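The conversion performed by ConvertResponsesInputToMessages() can be sketched as follows (an illustrative Python version of the mapping; the real implementation is C# and operates on the typed model classes):

```python
def convert_responses_input_to_messages(request: dict) -> list[dict]:
    """Convert Responses API input into Chat Completions messages
    so a local LLM can serve the request."""
    messages = []
    # Include instructions as a system message
    if request.get("instructions"):
        messages.append({"role": "system", "content": request["instructions"]})
    inp = request.get("input")
    if isinstance(inp, str):
        # String input maps to a single user message
        messages.append({"role": "user", "content": inp})
    elif isinstance(inp, list):
        # Structured items: flatten each item's content parts to text
        for item in inp:
            text_parts = [
                part.get("text", "")
                for part in item.get("content", [])
                if isinstance(part, dict)
            ]
            messages.append({
                "role": item.get("role", "user"),
                "content": " ".join(text_parts).strip(),
            })
    return messages
```

The reverse step then wraps the local LLM's reply in Responses API output items, so the caller never sees the intermediate Chat Completions format.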

Key Features

  • Full backward compatibility - all existing Chat Completions and Completions functionality is unchanged
  • Unified detection - a single OpenAIRequest.TryGetOpenAIRequest() method handles all request types
  • Proper priority - the Responses API is checked first (the API OpenAI recommends)
  • Usage tracking - token usage, costs, and rate limiting work across all APIs
  • Telemetry support - full OpenTelemetry integration for the Responses API
  • Mock support - local LLM mocking with automatic format conversion
  • No SDKs - pure C# implementation following existing patterns

Testing

  • Solution builds successfully without warnings or errors
  • All code analysis rules pass
  • Ready for manual testing with actual Responses API calls

All five plugins now support the OpenAI Responses API alongside existing APIs! 🎉

Original prompt

This section describes the original issue you should resolve

<issue_title>Add support for OpenAI Responses API</issue_title>
<issue_description>OpenAI recommends the use of Responses API instead of Chat Completions for new projects. Let's add support for it for all OpenAI-related plugins so that we support both chat completions and responses. More information https://platform.openai.com/docs/guides/migrate-to-responses

Update:

  • LanguageModelFailurePlugin
  • LanguageModelRateLimitingPlugin
  • OpenAITelemetryPlugin
  • OpenAIUsageDebuggingPlugin
  • OpenAIMockResponsePlugin

Ensure that existing functionality keeps working unaltered.</issue_description>

<agent_instructions>For each plugin analyze how it uses OpenAI Chat Completions APIs. Based on your analysis, make a plan to extend each plugin's functionality with support for OpenAI Responses API. Ensure that the existing functionality keeps working. Don't add any SDKs. Use the OpenAPI Responses API documentation to make classes that represent the API request and response for deserialization. Validate the plan with me before proceeding.</agent_instructions>

Comments on the Issue (you are @copilot in this section)



Copilot AI changed the title [WIP] Add support for OpenAI Responses API in plugins Add support for OpenAI Responses API to all OpenAI-related plugins Nov 16, 2025
Copilot finished work on behalf of waldekmastykarz November 16, 2025 10:41

waldekmastykarz commented Nov 16, 2025

@copilot looks solid, implement it

Co-authored-by: waldekmastykarz <11164679+waldekmastykarz@users.noreply.github.com>

Copilot AI commented Nov 16, 2025

@copilot looks solid, implement it

Implementation complete! ✅ All five plugins now support the OpenAI Responses API alongside existing APIs. Commit: 1f426b5

Key changes:

  • Added Responses API model classes (OpenAIResponsesRequest, OpenAIResponsesResponse, etc.)
  • Extended request detection to recognize Responses API (input + modalities fields)
  • Updated all 5 plugins to handle Responses API requests and responses
  • Maintained full backward compatibility with Chat Completions and Completions APIs
  • Solution builds successfully with 0 warnings/errors
  • CodeQL security scan passed with 0 alerts

The implementation follows the existing patterns and prioritizes the Responses API in detection as recommended by OpenAI.

Copilot finished work on behalf of waldekmastykarz November 16, 2025 11:45