
Switch to OpenAI Responses API#1981

Merged
kodiakhq[bot] merged 2 commits into hyperdxio:main from vinzee:va/openai_fixes
Mar 25, 2026

Conversation

@vinzee
Contributor

@vinzee vinzee commented Mar 24, 2026

Summary

#1960 added support for OpenAI's Chat Completions API.

This change switches to using OpenAI's new Responses API instead.

How to test locally or on Vercel

  1. Set env vars:
    AI_PROVIDER=openai AI_API_KEY= AI_BASE_URL=<> AI_MODEL_NAME=<> AI_REQUEST_HEADERS={"X-Client-Id":"","X-Username":""} AI_ADDITIONAL_OPTIONS={"API_TYPE":"responses"}
  2. Open HyperDX's chart explorer and use the AI assistant chart builder
    • e.g. "show me error count by service in the last hour"
  3. Confirm the assistant returns a valid chart config.
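Step 1 above can be written out as shell exports, which avoids the brittle one-line form. This is a sketch: the placeholder values are yours to fill in, and `AI_ADDITIONAL_OPTIONS` is the opt-in flag that later commits in this PR made the default and removed.

```shell
# Placeholder values — substitute your own key, endpoint, and model.
export AI_PROVIDER=openai
export AI_API_KEY=<your-key>
export AI_BASE_URL=<base-url>
export AI_MODEL_NAME=<model>
# JSON values are quoted so the shell passes them through verbatim.
export AI_REQUEST_HEADERS='{"X-Client-Id":"","X-Username":""}'
export AI_ADDITIONAL_OPTIONS='{"API_TYPE":"responses"}'
```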

References

  • Linear Issue:
  • Related PRs:

@vercel

vercel bot commented Mar 24, 2026

@vinzee is attempting to deploy a commit to the HyperDX Team on Vercel.

A member of the Team first needs to authorize it.

@changeset-bot

changeset-bot bot commented Mar 24, 2026

🦋 Changeset detected

Latest commit: 4ffac21

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 3 packages
Name Type
@hyperdx/api Patch
@hyperdx/app Patch
@hyperdx/otel-collector Patch

Not sure what this means? Click here to learn what changesets are.

Click here if you're a maintainer who wants to add another changeset to this PR

@vinzee vinzee changed the title Update OpenAI model configuration to use the new Responses API Switch to OpenAI's new Responses API Mar 24, 2026
@github-actions
Contributor

github-actions bot commented Mar 24, 2026

PR Review

  • ⚠️ Stale error message — Line 383 still references "claude-sonnet-4-5-20250929" for LiteLLM proxies as an example, but LiteLLM proxies use Chat Completions (/v1/chat/completions), not the Responses API (/v1/responses). This will mislead users into thinking LiteLLM is still supported → Update or remove the LiteLLM example from the error message.

  • ⚠️ Breaks OpenAI-compatible endpoints — The previous implementation explicitly supported Azure OpenAI, OpenRouter, and LiteLLM proxies (Chat Completions-compatible). Switching unconditionally to openai.responses() will silently break these providers since they don't implement the Responses API. The testing instructions reference AI_ADDITIONAL_OPTIONS = {API_TYPE: "responses"} as a toggle, but no such logic exists in the code → Either add a config-driven fallback to openai.chat() for Chat Completions-compatible providers, or clearly document this as a breaking change and bump the version accordingly (this is a patch changeset but removes existing functionality).
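The config-driven fallback the bot suggests could look roughly like this. It is a minimal sketch: the `openai` object here is a hypothetical stub standing in for `@ai-sdk/openai`'s provider (whose real `chat()`/`responses()` factories return model instances, not strings), and the `'gpt-4o'` model name and `ApiType` config shape are illustrative assumptions.

```typescript
type ApiType = 'responses' | 'chat';

interface OpenAIProviderLike {
  chat(modelName: string): string;
  responses(modelName: string): string;
}

// Stub provider so the selection logic is self-contained.
const openai: OpenAIProviderLike = {
  chat: modelName => `chat-completions:${modelName}`,
  responses: modelName => `responses:${modelName}`,
};

// Default to the Responses API, but let config opt back into Chat
// Completions for OpenAI-compatible proxies (Azure OpenAI, OpenRouter,
// LiteLLM) that only implement /v1/chat/completions.
function selectModel(modelName: string, apiType: ApiType = 'responses'): string {
  return apiType === 'chat'
    ? openai.chat(modelName)
    : openai.responses(modelName);
}

console.log(selectModel('gpt-4o'));         // responses:gpt-4o
console.log(selectModel('gpt-4o', 'chat')); // chat-completions:gpt-4o
```

The PR ultimately dropped this toggle in favor of the Responses API everywhere, per the review discussion below.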

@vinzee vinzee force-pushed the va/openai_fixes branch 3 times, most recently from 45eaa25 to 250f07b, March 24, 2026 17:13
@vinzee vinzee changed the title Switch to OpenAI's new Responses API Add support for OpenAI's new Responses API Mar 24, 2026
@vinzee vinzee changed the title Add support for OpenAI's new Responses API Add support for OpenAI Responses API Mar 24, 2026
@brandon-pereira brandon-pereira self-requested a review March 24, 2026 17:57
Comment thread on packages/api/src/controllers/ai.ts (outdated)
});

return openai.chat(config.AI_MODEL_NAME);
switch (additionalOptions.api_type) {
Member


Nobody has used this openai sdk since we only added it yesterday in your other PR (not released yet).

Why not just change to openai.responses for everything instead of the configuration? It's the default in the AI SDK

Since AI SDK 5, the OpenAI responses API is called by default (unless you specify e.g. 'openai.chat')

https://ai-sdk.dev/providers/ai-sdk-providers/openai

Thoughts?

Contributor Author


sure, I actually was doing that before. changed after the automated review comments 😄 . Let me make it the default and remove AI_ADDITIONAL_OPTIONS

Contributor Author


done ✅

@vinzee vinzee changed the title Add support for OpenAI Responses API Switch to OpenAI Responses API Mar 24, 2026
Member

@brandon-pereira brandon-pereira left a comment


LGTM - thanks again

@brandon-pereira
Member

@vinzee looks like some linting issues

@vercel

vercel bot commented Mar 25, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project: hyperdx-oss | Deployment: Ready | Actions: Preview, Comment | Updated (UTC): Mar 25, 2026 1:45am


@kodiakhq kodiakhq bot merged commit 629009d into hyperdxio:main Mar 25, 2026
13 of 15 checks passed
knudtty pushed a commit that referenced this pull request Apr 16, 2026
Copilot AI pushed a commit that referenced this pull request Apr 20, 2026
Co-authored-by: peter-leonov-ch <209667683+peter-leonov-ch@users.noreply.github.com>


2 participants