
Feature request: Add /v1/responses endpoint for OpenAI Responses API compatibility #56

@lengxii

Description


Background

OpenAI introduced the Responses API (POST /v1/responses) as the successor to Chat Completions. Tools such as Codex CLI (wire_api = "responses") use this endpoint exclusively and cannot fall back to /v1/chat/completions.

WindsurfAPI currently returns 404 Not Found for /v1/responses, which makes it incompatible with Codex and similar agents.

Proposed Solution

Add a /v1/responses endpoint that:

  1. Accepts the Responses API request format (input field instead of messages, reasoning.effort, etc.)
  2. Internally converts the request to Chat Completions format and dispatches it to the existing handler
  3. Converts the Chat Completions response back to Responses API format (an output array with type: "message" items)
  4. Supports both streaming (SSE with response.created, response.output_text.delta, response.completed events) and non-streaming modes
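Steps 1–2 could be a small, pure translation layer. The sketch below is illustrative only: the helper name and the exact field mapping are assumptions, not WindsurfAPI's actual code.

```python
def responses_to_chat_completions(body: dict) -> dict:
    """Translate a /v1/responses request body into a /v1/chat/completions one."""
    raw_input = body.get("input", "")
    if isinstance(raw_input, str):
        # A bare string input becomes a single user message.
        messages = [{"role": "user", "content": raw_input}]
    else:
        # A structured input array maps item-by-item onto chat messages.
        messages = [
            {"role": item.get("role", "user"), "content": item.get("content", "")}
            for item in raw_input
        ]

    cc_request = {
        "model": body["model"],
        "messages": messages,
        "stream": body.get("stream", False),
    }
    # Carry reasoning.effort over to the Chat Completions reasoning_effort field.
    effort = (body.get("reasoning") or {}).get("effort")
    if effort is not None:
        cc_request["reasoning_effort"] = effort
    return cc_request
```

Keeping the translation a dict-to-dict function (no I/O) makes it reusable for both the streaming and non-streaming paths.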

Example Request

POST /v1/responses
{
  "model": "gpt-5-5-xhigh",
  "input": "Hello",
  "reasoning": {"effort": "high"},
  "stream": true
}
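With stream: true, this request would produce the SSE event sequence named in step 4. A minimal sketch of the framing, with illustrative minimal payloads (the real events carry more fields):

```python
import json

def sse_event(event: str, data: dict) -> str:
    """Frame one server-sent event: an event: line, a data: line, blank-line terminator."""
    return f"event: {event}\ndata: {json.dumps(data)}\n\n"

def stream_events(response_id: str, deltas: list[str]):
    """Yield the Responses API event sequence for a list of text deltas."""
    yield sse_event("response.created",
                    {"type": "response.created", "response": {"id": response_id}})
    for delta in deltas:
        yield sse_event("response.output_text.delta",
                        {"type": "response.output_text.delta", "delta": delta})
    yield sse_event("response.completed",
                    {"type": "response.completed",
                     "response": {"id": response_id, "status": "completed"}})
```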

Example Response (non-streaming)

{
  "id": "resp-xxx",
  "object": "response",
  "model": "gpt-5-5-xhigh",
  "output": [
    {
      "type": "message",
      "role": "assistant",
      "content": [{"type": "output_text", "text": "Hi!"}],
      "status": "completed"
    }
  ],
  "usage": {...},
  "status": "completed"
}
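Step 3's reverse mapping, producing the shape above from a Chat Completions response, might look like the following sketch; the function name and id generation are illustrative assumptions:

```python
import uuid

def chat_completions_to_response(cc: dict) -> dict:
    """Wrap a Chat Completions response in the Responses API shape."""
    choice = cc["choices"][0]
    return {
        "id": f"resp-{uuid.uuid4().hex}",
        "object": "response",
        "model": cc["model"],
        "output": [
            {
                "type": "message",
                "role": "assistant",
                "content": [
                    {"type": "output_text", "text": choice["message"]["content"]}
                ],
                "status": "completed",
            }
        ],
        "usage": cc.get("usage", {}),
        "status": "completed",
    }
```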

This would make WindsurfAPI a drop-in backend for Codex (via /v1/responses), alongside Claude Code (already supported via /v1/messages) and standard Chat Completions clients.

I have a working implementation and can submit a PR if interested.

Metadata

Labels

enhancement (New feature or request), fixed已修复 等待确认 (fixed, awaiting confirmation), help wanted (Extra attention is needed)
