Add OpenAI Responses adapter, provider_options, and response.schema mapping #9

Merged
@PredictabilityAtScale merged 1 commit into main from codex/add-support-for-new-conversation-body-format on Apr 24, 2026

Conversation

@PredictabilityAtScale
Owner

Motivation

  • Add first-class support for the OpenAI Responses API alongside existing chat-style OpenAI support.
  • Expose provider-specific knobs (provider_options) to surface advanced options for Anthropic and Gemini while keeping normalized front matter.
  • Support structured JSON-schema outputs via a normalized response.schema field and map it to provider-specific fields.

Description

  • Introduces a new adapter, openaiResponsesAdapter, in src/providers/openai-responses.ts, registers it in src/providers/index.ts, and re-exports it from src/index.ts.
  • Adds runtime openaiResponses options and OpenAIResponsesRuntimeOptions to src/providers/types.ts, and exposes openaiResponses on RenderPromptOptions in src/index.ts.
  • Extends the schema (src/schema/schema.ts) with response.schema, schema_name, and schema_strict, adds a new provider_options structure with anthropic and gemini option shapes, and allows openai-responses in provider enums and defaults.
  • Maps response.schema to provider-specific fields: OpenAI/OpenRouter use response_format.json_schema, the Responses API uses text.format.json_schema, and Gemini uses generationConfig.responseSchema; also adds warnings and behavior notes (e.g. Gemini streaming is endpoint-based).
  • Adds handling for provider_options in apply-overrides.ts and maps provider-specific options into request bodies in src/providers/anthropic.ts and src/providers/gemini.ts.
  • Updates docs (README.md, docs/api-reference.md, docs/providers.md) to document the new adapter, runtime options, provider options, and streaming nuances.
  • Adds build entry point and package export for providers/openai-responses in tsup.config.ts and package.json.
  • Extends and updates unit tests in tests/providers.test.ts to cover openai-responses, response.schema mapping, provider_options behavior, Gemini streaming warning, and provider naming.
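The response.schema mapping described above can be sketched in TypeScript. This is an illustrative outline only: the names `NormalizedSchema` and `mapResponseSchema` are assumptions for this sketch, not the library's actual API, and the exact field shapes are inferred from the bullet list.

```typescript
// Illustrative sketch only: NormalizedSchema and mapResponseSchema are
// hypothetical names, not the library's real exports.
interface NormalizedSchema {
  schema: Record<string, unknown>; // the JSON Schema body from response.schema
  schema_name?: string;            // maps to the json_schema "name" field where supported
  schema_strict?: boolean;         // maps to the json_schema "strict" flag where supported
}

type Provider = "openai" | "openrouter" | "openai-responses" | "gemini";

function mapResponseSchema(provider: Provider, s: NormalizedSchema): Record<string, any> {
  const jsonSchema = {
    name: s.schema_name ?? "response",
    strict: s.schema_strict ?? false,
    schema: s.schema,
  };
  switch (provider) {
    case "openai":
    case "openrouter":
      // Chat-style APIs: response_format.json_schema
      return { response_format: { type: "json_schema", json_schema: jsonSchema } };
    case "openai-responses":
      // Responses API: text.format carries the json_schema fields
      return { text: { format: { type: "json_schema", ...jsonSchema } } };
    case "gemini":
      // Gemini: generationConfig.responseSchema (schema only; name/strict
      // have no direct equivalent, so only the schema body is forwarded)
      return {
        generationConfig: {
          responseMimeType: "application/json",
          responseSchema: s.schema,
        },
      };
  }
}
```

The point of the normalized field is that prompt front matter declares one `response.schema`, and each adapter owns the translation into its provider's request-body shape.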

Testing

  • Updated and ran the unit test suite via npm test (runs vitest run), which exercises the new openai-responses adapter, response.schema mappings, and provider_options; tests passed.
  • Ran adapter-level validation tests for OpenAI, Anthropic, Gemini, and Responses via the updated tests/providers.test.ts, and they succeeded.
  • Built the library entry points with the updated tsup config by running the library build script (tsup) in CI; the build completed successfully.

Codex Task


@chatgpt-codex-connector (Bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: ab48cbbeb0


Comment thread src/index.ts
Comment on lines 265 to +268
history: options.history,
toolRegistry: options.toolRegistry,
strict: options.strict,
openaiResponses: options.openaiResponses,


P1: Pass OpenAI Responses runtime options into provider validation

renderPrompt() now forwards openaiResponses only when calling adapter.render, but provider validation is still run earlier without runtime options. That means the new mutual-exclusion guard for conversation + previous_response_id in openai-responses never executes on the main API path, so invalid combinations are emitted into request bodies and fail at the provider API instead of being caught locally.
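One way to address the ordering issue the comment describes can be sketched as follows. The signatures here are deliberately simplified assumptions (`validateResponsesOptions` and these option shapes are illustrative, not the library's real API); the idea is simply that validation runs with the same runtime options that `adapter.render` receives.

```typescript
// Illustrative sketch: run the mutual-exclusion guard on the same runtime
// options that adapter.render receives, before any request body is emitted.
interface ResponsesRuntimeOptions {
  conversation?: unknown;          // conversation-style body input
  previous_response_id?: string;   // chaining onto a prior response
}

function validateResponsesOptions(opts: ResponsesRuntimeOptions): void {
  // The Responses API treats these as mutually exclusive ways of supplying
  // conversational context, so reject the combination locally.
  if (opts.conversation !== undefined && opts.previous_response_id !== undefined) {
    throw new Error(
      "openai-responses: conversation and previous_response_id are mutually exclusive",
    );
  }
}

function renderPrompt(options: { openaiResponses?: ResponsesRuntimeOptions }): void {
  // Validate BEFORE rendering, with the runtime options included, instead of
  // running provider validation earlier without them.
  if (options.openaiResponses) validateResponsesOptions(options.openaiResponses);
  // ...adapter.render(..., options.openaiResponses) would follow here.
}
```

With this ordering, the invalid combination fails fast in `renderPrompt()` rather than surfacing as a provider-side API error.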


Comment thread src/schema/schema.ts
@PredictabilityAtScale merged commit 8577657 into main on Apr 24, 2026
2 checks passed