
Conversation

Contributor

@jsonbailey jsonbailey commented Oct 13, 2025

Note

Introduce packages/ai-providers/server-ai-openai with implementation, tests, docs, and wire it into CI/release and workspace configs.

  • AI Providers:
    • New Package: packages/ai-providers/server-ai-openai
      • Implements OpenAIProvider (src/OpenAIProvider.ts, src/index.ts) with chat completions, metrics extraction, and client access.
      • Adds tests (__tests__/OpenAIProvider.test.ts), build/test configs (tsconfig*.json, jest.config.js, typedoc.json), and package metadata (package.json).
      • Includes package README with usage and badges.
  • CI/Release:
    • Adds workflow /.github/workflows/server-ai-openai.yml for build/test.
    • Integrates with release process:
      • release-please-config.json + .release-please-manifest.json entries.
      • New job release-server-ai-openai and output wiring in /.github/workflows/release-please.yml.
    • Enables manual publish via /.github/workflows/manual-publish.yml option.
  • Repo Config:
    • Adds workspace entry in root package.json and TS project reference in root tsconfig.json.
    • Updates root README.md to list the OpenAI provider and badges.
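
The metrics extraction mentioned above could be sketched roughly as follows. This is a hypothetical illustration, not the package's actual implementation: the `AIMetrics` field names and the `createAIMetrics` signature are assumptions, while the `usage` shape matches OpenAI's documented chat completion responses.

```typescript
// Hypothetical sketch of mapping an OpenAI chat completion response to
// token-usage metrics. `AIMetrics` and its fields are assumed names; the
// `usage` input shape follows OpenAI's chat completion response format.
interface AIMetrics {
  success: boolean;
  usage?: { total: number; input: number; output: number };
}

function createAIMetrics(response: {
  usage?: { total_tokens: number; prompt_tokens: number; completion_tokens: number };
}): AIMetrics {
  // Token counts are only attached when the response includes a usage block.
  const usage = response.usage
    ? {
        total: response.usage.total_tokens,
        input: response.usage.prompt_tokens,
        output: response.usage.completion_tokens,
      }
    : undefined;
  // Success defaults to true; callers can flip it if the content is missing.
  return { success: true, usage };
}

const metrics = createAIMetrics({
  usage: { total_tokens: 30, prompt_tokens: 20, completion_tokens: 10 },
});
console.log(metrics.usage?.total); // 30
```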

Written by Cursor Bugbot for commit 42d12cd. This will update automatically on new commits.

@jsonbailey jsonbailey requested a review from a team as a code owner October 13, 2025 15:59

@launchdarkly/browser size report
This is the brotli compressed size of the ESM build.
Compressed size: 169118 bytes
Compressed size limit: 200000 bytes
Uncompressed size: 789399 bytes


@launchdarkly/js-sdk-common size report
This is the brotli compressed size of the ESM build.
Compressed size: 24988 bytes
Compressed size limit: 26000 bytes
Uncompressed size: 122411 bytes


@launchdarkly/js-client-sdk size report
This is the brotli compressed size of the ESM build.
Compressed size: 21721 bytes
Compressed size limit: 25000 bytes
Uncompressed size: 74698 bytes


@launchdarkly/js-client-sdk-common size report
This is the brotli compressed size of the ESM build.
Compressed size: 17636 bytes
Compressed size limit: 20000 bytes
Uncompressed size: 90259 bytes


Comment on lines +57 to +86
```typescript
async invokeModel(messages: LDMessage[]): Promise<ChatResponse> {
  // Call OpenAI chat completions API
  const response = await this._client.chat.completions.create({
    model: this._modelName,
    messages,
    ...this._parameters,
  });

  // Generate metrics early (assumes success by default)
  const metrics = OpenAIProvider.createAIMetrics(response);

  // Safely extract the first choice content using optional chaining
  const content = response?.choices?.[0]?.message?.content || '';

  if (!content) {
    this.logger?.warn('OpenAI response has no content available');
    metrics.success = false;
  }

  // Create the assistant message
  const assistantMessage: LDMessage = {
    role: 'assistant',
    content,
  };

  return {
    message: assistantMessage,
    metrics,
  };
}
```
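
The optional-chaining extraction above guards against both a missing `choices` array and a null `message.content`. As a standalone illustration of that pattern (`extractContent` is a hypothetical helper, not part of the package):

```typescript
// Standalone illustration of the content-extraction pattern used in
// invokeModel. `extractContent` and `ChatCompletionLike` are hypothetical
// names introduced here for the sketch.
type ChatCompletionLike = {
  choices?: { message?: { content?: string | null } }[];
};

function extractContent(response: ChatCompletionLike): string {
  // Optional chaining short-circuits to undefined at any missing level;
  // `|| ''` then normalizes undefined/null/empty to an empty string.
  return response?.choices?.[0]?.message?.content || '';
}

console.log(extractContent({ choices: [{ message: { content: 'hi' } }] })); // "hi"
console.log(extractContent({ choices: [] })); // ""
```

An empty result is what triggers the `metrics.success = false` branch in the method above.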

I think we may want to use the responses API: https://platform.openai.com/docs/api-reference/responses

It's newer and supports more of their functionality (i.e., tools).


@andrewklatzke andrewklatzke left a comment


Chatted on Slack -

We will want to cut over to the Responses API. We're good to launch with this if we note in the README that it's using completions, and we should swap to the Responses API as a fast follow if possible.
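
A cutover would mean mapping the Responses API result back into the provider's existing `ChatResponse` shape. A rough sketch, assuming the `output_text` convenience field documented for the openai Node SDK's Responses API, with the surrounding types simplified for illustration:

```typescript
// Rough sketch of adapting a Responses API result to the provider's
// LDMessage shape. `output_text` is the openai SDK's aggregated text
// field on Responses API results; `toAssistantMessage` is a hypothetical
// helper, and the types here are simplified.
type LDMessage = { role: 'assistant' | 'user' | 'system'; content: string };

function toAssistantMessage(response: { output_text?: string }): LDMessage {
  // Normalize a missing output to an empty string, mirroring how
  // invokeModel treats an empty chat-completion choice.
  return { role: 'assistant', content: response.output_text ?? '' };
}

console.log(toAssistantMessage({ output_text: 'hello' }).content); // "hello"
```

The call site would change from `client.chat.completions.create(...)` to the Responses API, but the message shape returned to the AI SDK could stay the same.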


@jsonbailey jsonbailey merged commit 5722911 into main Oct 14, 2025
33 checks passed
@jsonbailey jsonbailey deleted the jb/sdk-1455/ai-provider-openai branch October 14, 2025 20:08
@github-actions github-actions bot mentioned this pull request Oct 14, 2025
jsonbailey pushed a commit that referenced this pull request Oct 14, 2025
🤖 I have created a release *beep* *boop*
---


<details><summary>server-sdk-ai: 0.12.1</summary>

## [0.12.1](server-sdk-ai-v0.12.0...server-sdk-ai-v0.12.1) (2025-10-14)

### Bug Fixes

* Improve documentation for AI SDK and AIProvider ([#958](#958)) ([17d595a](17d595a))
</details>

<details><summary>server-sdk-ai-langchain: 0.1.1</summary>

## [0.1.1](server-sdk-ai-langchain-v0.1.0...server-sdk-ai-langchain-v0.1.1) (2025-10-14)

### Bug Fixes

* Improve documentation for AI SDK and AIProvider ([#958](#958)) ([17d595a](17d595a))


### Dependencies

* The following workspace dependencies were updated
  * dependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.0 to ^0.12.1
</details>

<details><summary>server-sdk-ai-openai: 0.1.0</summary>

## 0.1.0 (2025-10-14)


### Features

* Add OpenAI Provider for AI SDK ([#947](#947)) ([5722911](5722911))


### Dependencies

* The following workspace dependencies were updated
  * dependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.0 to ^0.12.1
</details>

<details><summary>server-sdk-ai-vercel: 0.1.0</summary>

## 0.1.0 (2025-10-14)


### Features

* Add VercelAI Provider for AI SDK ([#948](#948)) ([1db731b](1db731b))


### Dependencies

* The following workspace dependencies were updated
  * dependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.0 to ^0.12.1
</details>

---
This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).

<!-- CURSOR_SUMMARY -->
---

> [!NOTE]
> Publish server AI SDK 0.12.1, introduce new OpenAI and Vercel providers (0.1.0), bump LangChain provider to 0.1.1, and align dependencies/examples to ^0.12.1.
> 
> - **AI SDK**:
>   - Bump `packages/sdk/server-ai` to `0.12.1` with changelog entry.
> - **AI Providers**:
>   - New `packages/ai-providers/server-ai-openai` at `0.1.0` with changelog; depends on `@launchdarkly/server-sdk-ai@^0.12.1`.
>   - New `packages/ai-providers/server-ai-vercel` at `0.1.0` with changelog; depends on `@launchdarkly/server-sdk-ai@^0.12.1`.
>   - Bump `packages/ai-providers/server-ai-langchain` to `0.1.1`; updates dep to `@launchdarkly/server-sdk-ai@^0.12.1` and changelog.
> - **Examples**:
>   - Update example apps to use `@launchdarkly/server-sdk-ai@0.12.1`.
> - **Release Manifest**:
>   - Update `.release-please-manifest.json` versions for the above packages.
> 
> <sup>Written by [Cursor Bugbot](https://cursor.com/dashboard?tab=bugbot) for commit 6d5d7c6. This will update automatically on new commits. Configure [here](https://cursor.com/dashboard?tab=bugbot).</sup>
<!-- /CURSOR_SUMMARY -->

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
