feat: Add OpenAI Provider for AI SDK #947
Size reports:
- @launchdarkly/browser size report
- @launchdarkly/js-sdk-common size report
- @launchdarkly/js-client-sdk size report
- @launchdarkly/js-client-sdk-common size report
```typescript
async invokeModel(messages: LDMessage[]): Promise<ChatResponse> {
  // Call the OpenAI chat completions API
  const response = await this._client.chat.completions.create({
    model: this._modelName,
    messages,
    ...this._parameters,
  });

  // Generate metrics early (assumes success by default)
  const metrics = OpenAIProvider.createAIMetrics(response);

  // Safely extract the first choice content using optional chaining
  const content = response?.choices?.[0]?.message?.content || '';

  if (!content) {
    this.logger?.warn('OpenAI response has no content available');
    metrics.success = false;
  }

  // Create the assistant message
  const assistantMessage: LDMessage = {
    role: 'assistant',
    content,
  };

  return {
    message: assistantMessage,
    metrics,
  };
}
```
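The `createAIMetrics` helper is referenced but not shown in this diff. A minimal sketch of what it might extract from a chat-completions response, assuming the response carries OpenAI's standard `usage` token counts; the `SketchMetrics` shape and function name here are hypothetical, not the SDK's actual types:

```typescript
// Hypothetical sketch only: not the SDK's real implementation.
interface SketchMetrics {
  success: boolean;
  usage?: { total: number; input: number; output: number };
}

// Assumes the completions response includes OpenAI's `usage` object
// (prompt_tokens / completion_tokens / total_tokens).
function createAIMetricsSketch(response: {
  usage?: { prompt_tokens: number; completion_tokens: number; total_tokens: number };
}): SketchMetrics {
  const metrics: SketchMetrics = { success: true };
  if (response.usage) {
    metrics.usage = {
      total: response.usage.total_tokens,
      input: response.usage.prompt_tokens,
      output: response.usage.completion_tokens,
    };
  }
  return metrics;
}
```

Generating metrics before inspecting the content (as `invokeModel` does above) lets the empty-content branch flip `success` to `false` without losing the token accounting.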
I think we may want to use the Responses API: https://platform.openai.com/docs/api-reference/responses
It's newer and supports more of their functionality (e.g. tools).
Chatted on Slack: we will want to cut over to the Responses API. We're good to launch with this if we note in the README that it uses chat completions, and we should swap to the Responses API as a fast follow if possible.
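For the eventual cut-over, a hedged sketch of the shape the change might take: the Responses API takes an `input` array of role/content items rather than chat `messages`, so for plain text history the mapping is close to a pass-through. The `toResponsesInput` helper and the `LDMessage` literal below are illustrative assumptions, not code from this PR:

```typescript
// Hypothetical mapping sketch for a future Responses API cut-over.
type LDMessage = { role: 'system' | 'user' | 'assistant'; content: string };

// For plain text chat history, Responses `input` items use the same
// role/content shape, so this is essentially an identity mapping.
function toResponsesInput(messages: LDMessage[]) {
  return messages.map(({ role, content }) => ({ role, content }));
}

// Against the real client this would look roughly like (not executed here):
// const response = await this._client.responses.create({
//   model: this._modelName,
//   input: toResponsesInput(messages),
// });
// const content = response.output_text ?? '';
```

Tool definitions and multimodal content would need richer mapping than this sketch covers, which is presumably why the README note and fast follow were agreed on.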
🤖 I have created a release *beep* *boop*

---

<details><summary>server-sdk-ai: 0.12.1</summary>

## [0.12.1](server-sdk-ai-v0.12.0...server-sdk-ai-v0.12.1) (2025-10-14)

### Bug Fixes

* Improve documentation for AI SDK and AIProvider ([#958](#958)) ([17d595a](17d595a))

</details>

<details><summary>server-sdk-ai-langchain: 0.1.1</summary>

## [0.1.1](server-sdk-ai-langchain-v0.1.0...server-sdk-ai-langchain-v0.1.1) (2025-10-14)

### Bug Fixes

* Improve documentation for AI SDK and AIProvider ([#958](#958)) ([17d595a](17d595a))

### Dependencies

* The following workspace dependencies were updated
  * dependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.0 to ^0.12.1

</details>

<details><summary>server-sdk-ai-openai: 0.1.0</summary>

## 0.1.0 (2025-10-14)

### Features

* Add OpenAI Provider for AI SDK ([#947](#947)) ([5722911](5722911))

### Dependencies

* The following workspace dependencies were updated
  * dependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.0 to ^0.12.1

</details>

<details><summary>server-sdk-ai-vercel: 0.1.0</summary>

## 0.1.0 (2025-10-14)

### Features

* Add VercelAI Provider for AI SDK ([#948](#948)) ([1db731b](1db731b))

### Dependencies

* The following workspace dependencies were updated
  * dependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.0 to ^0.12.1

</details>

---

This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).

> [!NOTE]
> Publish server AI SDK 0.12.1, introduce new OpenAI and Vercel providers (0.1.0), bump LangChain provider to 0.1.1, and align dependencies/examples to ^0.12.1.
>
> - **AI SDK**:
>   - Bump `packages/sdk/server-ai` to `0.12.1` with changelog entry.
> - **AI Providers**:
>   - New `packages/ai-providers/server-ai-openai` at `0.1.0` with changelog; depends on `@launchdarkly/server-sdk-ai@^0.12.1`.
>   - New `packages/ai-providers/server-ai-vercel` at `0.1.0` with changelog; depends on `@launchdarkly/server-sdk-ai@^0.12.1`.
>   - Bump `packages/ai-providers/server-ai-langchain` to `0.1.1`; updates dep to `@launchdarkly/server-sdk-ai@^0.12.1` and changelog.
> - **Examples**:
>   - Update example apps to use `@launchdarkly/server-sdk-ai@0.12.1`.
> - **Release Manifest**:
>   - Update `.release-please-manifest.json` versions for the above packages.
>
> <sup>Written by [Cursor Bugbot](https://cursor.com/dashboard?tab=bugbot) for commit 6d5d7c6. This will update automatically on new commits. Configure [here](https://cursor.com/dashboard?tab=bugbot).</sup>

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
> [!NOTE]
> Introduce `packages/ai-providers/server-ai-openai` with implementation, tests, docs, and wire it into CI/release and workspace configs.
>
> - `packages/ai-providers/server-ai-openai`: `OpenAIProvider` (`src/OpenAIProvider.ts`, `src/index.ts`) with chat completions, metrics extraction, and client access.
> - Tests (`__tests__/OpenAIProvider.test.ts`), build/test configs (`tsconfig*.json`, `jest.config.js`, `typedoc.json`), and package metadata (`package.json`).
> - CI: `.github/workflows/server-ai-openai.yml` for build/test.
> - Release: `release-please-config.json` + `.release-please-manifest.json` entries; `release-server-ai-openai` and output wiring in `.github/workflows/release-please.yml`; `.github/workflows/manual-publish.yml` option.
> - Workspace: root `package.json` and TS project reference in root `tsconfig.json`.
> - Docs: `README.md` updated to list the OpenAI provider and badges.
>
> Written by Cursor Bugbot for commit 42d12cd. This will update automatically on new commits. Configure here.