Add Responses API compatibility alongside chat/completions #231

@toreleon

Description

Problem

The proxy currently exposes chat-completions and related OpenAI-compatibility routes, but it does not expose the OpenAI Responses API routes. This blocks models that are available only through the Responses API, even when the underlying GitHub Copilot account can use them.

Proposed change

  • add Responses API routes alongside the existing chat-completions routes
  • forward requests to the upstream GitHub Copilot endpoint (see the sketch below)
  • include verbose logging and token-count reporting similar to the existing chat-completions handler

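A minimal sketch of the combined forwarding and logging handler, assuming a Hono app like the existing server; `COPILOT_API_BASE`, the auth-header pass-through, and the usage-logging shape are illustrative assumptions, not the project's actual helpers:

```ts
import { Hono } from "hono"

const app = new Hono()

// Assumed upstream base URL; the real proxy derives this from the
// authenticated Copilot session rather than a constant.
const COPILOT_API_BASE = "https://api.githubcopilot.com"

app.post("/v1/responses", async (c) => {
  const payload = await c.req.json()

  // Forward the body to the upstream Responses endpoint, reusing the
  // caller's authorization header as a stand-in for real token handling.
  const upstream = await fetch(`${COPILOT_API_BASE}/responses`, {
    method: "POST",
    headers: {
      "content-type": "application/json",
      authorization: c.req.header("authorization") ?? "",
    },
    body: JSON.stringify(payload),
  })

  // Streaming responses are passed through untouched.
  if (payload.stream) {
    return new Response(upstream.body, {
      status: upstream.status,
      headers: upstream.headers,
    })
  }

  // Non-streaming responses carry a usage object, which supports the
  // same token-count reporting the chat-completions handler does.
  const body = await upstream.json()
  if (body.usage) {
    console.info(
      `responses usage: input=${body.usage.input_tokens} `
        + `output=${body.usage.output_tokens} total=${body.usage.total_tokens}`,
    )
  }
  return new Response(JSON.stringify(body), {
    status: upstream.status,
    headers: { "content-type": "application/json" },
  })
})

export default app
```

The actual route is implemented in #230; this sketch only illustrates the intended shape.
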
Related PR

Implemented here: #230

Validation

  • source build passes:

    ```
    ℹ tsdown v0.15.6 powered by rolldown v1.0.0-beta.41
    ℹ Using tsdown config: /Users/ThangLT4/Desktop/code/copilot-api/tsdown.config.ts
    ℹ entry: src/main.ts
    ℹ target: es2022
    ℹ tsconfig: tsconfig.json
    ℹ Build start
    ℹ Cleaning 2 files
    ℹ Granting execute permission to dist/main.js
    ℹ dist/main.js 51.04 kB │ gzip: 12.83 kB
    ℹ dist/main.js.map 112.86 kB │ gzip: 27.10 kB
    ℹ 2 files, total: 163.90 kB
    ✔ Build complete in 532ms
    ```
  • verified the new routes locally against the running proxy (see the snippet below)
  • verified that Responses-only models are not usable via the chat-completions route, so Responses support is required
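
For reference, a local check of the new route can be done with a small script like the one below; the port and the request body are assumptions based on a default setup, not a record of the exact commands that were run:

```ts
// Hypothetical local smoke test; the port and model id are assumptions,
// substitute whatever your proxy instance uses.
const res = await fetch("http://localhost:4141/v1/responses", {
  method: "POST",
  headers: { "content-type": "application/json" },
  body: JSON.stringify({
    model: "gpt-4o", // placeholder model id
    input: "Say hello",
  }),
})

console.log(res.status, await res.json())
```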
