Conversation

@jsonbailey (Contributor) commented Nov 5, 2025

Note

Adds structured-output invokeStructuredModel with JSON schema parsing and metrics, and hardens invokeModel with try/catch and empty-content handling; comprehensive tests included.

  • Server OpenAI Provider (src/OpenAIProvider.ts):
    • Structured Output: New invokeStructuredModel(messages, responseStructure) using response_format: json_schema, parses JSON payload, returns data, rawResponse, and metrics; handles missing content, JSON parse failures, and API errors with logging.
    • invokeModel: Wrapped in try/catch; derives metrics early; warns and marks failure on empty content.
    • Import StructuredResponse type.
  • Tests (__tests__/OpenAIProvider.test.ts):
    • Add tests for invokeStructuredModel success, missing content, invalid JSON, and empty choices.
    • Extend invokeModel tests for empty/missing choices and metrics expectations.
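
The bullets above can be illustrated with a rough sketch of what such a method might look like. Everything here is an assumption for illustration: the `StructuredResponse` shape, the client typing, and the metric fields are hypothetical stand-ins, not the provider's actual implementation.

```typescript
// Hypothetical stand-in for the SDK's StructuredResponse type.
interface StructuredResponse {
  data: unknown;
  rawResponse: string;
  metrics: { success: boolean; usage?: { total: number } };
}

// Sketch of a structured-output invoke. `client` is typed loosely here;
// the real provider wraps the official OpenAI client.
async function invokeStructuredModel(
  client: { chat: { completions: { create: (req: any) => Promise<any> } } },
  modelName: string,
  messages: { role: string; content: string }[],
  responseStructure: { name: string; schema: Record<string, unknown> },
): Promise<StructuredResponse> {
  try {
    const completion = await client.chat.completions.create({
      model: modelName,
      messages,
      // Ask the model to emit JSON conforming to the supplied schema.
      response_format: {
        type: 'json_schema',
        json_schema: { name: responseStructure.name, schema: responseStructure.schema },
      },
    });
    const content = completion.choices?.[0]?.message?.content;
    if (!content) {
      // Missing/empty content: warn and mark failure instead of throwing.
      console.warn('invokeStructuredModel: response contained no content');
      return { data: undefined, rawResponse: '', metrics: { success: false } };
    }
    try {
      return {
        data: JSON.parse(content),
        rawResponse: content,
        metrics: { success: true, usage: { total: completion.usage?.total_tokens ?? 0 } },
      };
    } catch {
      // JSON parse failure: surface the raw payload and mark failure.
      console.warn('invokeStructuredModel: failed to parse JSON payload');
      return { data: undefined, rawResponse: content, metrics: { success: false } };
    }
  } catch (err) {
    // API error: log and report a failed invocation.
    console.warn('invokeStructuredModel: API call failed', err);
    return { data: undefined, rawResponse: '', metrics: { success: false } };
  }
}
```

A fake client makes the error paths easy to exercise, which mirrors the test cases listed above (success, missing content, invalid JSON, empty choices).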

Written by Cursor Bugbot for commit 5a3cffe. This will update automatically on new commits. Configure here.

@jsonbailey jsonbailey requested a review from a team as a code owner November 5, 2025 14:09

github-actions bot commented Nov 5, 2025

@launchdarkly/browser size report
This is the brotli compressed size of the ESM build.
Compressed size: 169118 bytes
Compressed size limit: 200000 bytes
Uncompressed size: 789399 bytes


github-actions bot commented Nov 5, 2025

@launchdarkly/js-sdk-common size report
This is the brotli compressed size of the ESM build.
Compressed size: 24988 bytes
Compressed size limit: 26000 bytes
Uncompressed size: 122411 bytes


github-actions bot commented Nov 5, 2025

@launchdarkly/js-client-sdk-common size report
This is the brotli compressed size of the ESM build.
Compressed size: 17636 bytes
Compressed size limit: 20000 bytes
Uncompressed size: 90259 bytes


github-actions bot commented Nov 5, 2025

@launchdarkly/js-client-sdk size report
This is the brotli compressed size of the ESM build.
Compressed size: 21721 bytes
Compressed size limit: 25000 bytes
Uncompressed size: 74698 bytes

@jsonbailey jsonbailey changed the title feat! : Support invoke with structured output in OpenAI provider feat!: Support invoke with structured output in OpenAI provider Nov 5, 2025
@tanderson-ld tanderson-ld self-requested a review November 5, 2025 20:52
@jsonbailey jsonbailey merged commit 515dbdf into main Nov 5, 2025
32 checks passed
@jsonbailey jsonbailey deleted the jb/sdk-1523/structed-model-provider-openai branch November 5, 2025 21:17
@github-actions github-actions bot mentioned this pull request Nov 5, 2025
jsonbailey added a commit that referenced this pull request Nov 6, 2025
🤖 I have created a release *beep* *boop*
---


<details><summary>server-sdk-ai: 0.14.0</summary>

## [0.14.0](server-sdk-ai-v0.13.0...server-sdk-ai-v0.14.0) (2025-11-06)


### ⚠ BREAKING CHANGES

* Removed deprecated Vercel methods
([#983](#983))
* Add support for real time judge evals
([#969](#969))
* AI Config defaults require the "enabled" attribute
* Renamed LDAIAgentConfig to LDAIAgentConfigRequest for clarity
* Renamed LDAIAgent to LDAIAgentConfig (note that this name was previously used for a different type)
* Renamed LDAIAgentDefault to LDAIAgentConfigDefault for clarity
* Renamed LDAIDefaults to LDAICompletionConfigDefault for clarity

### Features

* Add support for real time judge evals
([#969](#969))
([6ecd9ab](6ecd9ab))
* Added createJudge method
([6ecd9ab](6ecd9ab))
* Added judgeConfig method to AI SDK to retrieve an AI Judge Config
([6ecd9ab](6ecd9ab))
* Added trackEvalScores method to config tracker
([6ecd9ab](6ecd9ab))
* Chat will evaluate responses with configured judges
([6ecd9ab](6ecd9ab))
* Include AI SDK version in tracking information
([#985](#985))
([ef90564](ef90564))
* Removed deprecated Vercel methods
([#983](#983))
([960a499](960a499))


### Bug Fixes

* AI Config defaults require the "enabled" attribute
([6ecd9ab](6ecd9ab))
* Renamed LDAIAgent to LDAIAgentConfig (note that this name was previously used for a different type)
([6ecd9ab](6ecd9ab))
* Renamed LDAIAgentConfig to LDAIAgentConfigRequest for clarity
([6ecd9ab](6ecd9ab))
* Renamed LDAIAgentDefault to LDAIAgentConfigDefault for clarity
([6ecd9ab](6ecd9ab))
* Renamed LDAIDefaults to LDAICompletionConfigDefault for clarity
([6ecd9ab](6ecd9ab))
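
One of the breaking changes above — AI Config defaults now requiring the `enabled` attribute — can be sketched as follows. The interface below is a hypothetical stand-in for illustration, not the SDK's actual `LDAICompletionConfigDefault` definition.

```typescript
// Hypothetical stand-in for the SDK's completion-config default type.
// As of 0.14.0, `enabled` is required rather than optional.
interface LDAICompletionConfigDefault {
  enabled: boolean;
  model?: { name: string };
  messages?: { role: string; content: string }[];
}

// Previously a defaults object could omit `enabled`; it must now be explicit.
const defaults: LDAICompletionConfigDefault = {
  enabled: true,
  model: { name: 'my-model' },
};
```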
</details>

<details><summary>server-sdk-ai-langchain: 0.3.0</summary>

## [0.3.0](server-sdk-ai-langchain-v0.2.0...server-sdk-ai-langchain-v0.3.0) (2025-11-06)


### ⚠ BREAKING CHANGES

* Support invoke with structured output in LangChain provider
([#970](#970))

### Features

* Support invoke with structured output in LangChain provider
([#970](#970))
([0427908](0427908))


### Dependencies

* The following workspace dependencies were updated
  * devDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.13.0 to ^0.14.0
  * peerDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.2 to ^0.14.0
</details>

<details><summary>server-sdk-ai-openai: 0.3.0</summary>

## [0.3.0](server-sdk-ai-openai-v0.2.0...server-sdk-ai-openai-v0.3.0) (2025-11-06)


### ⚠ BREAKING CHANGES

* Support invoke with structured output in OpenAI provider
([#980](#980))

### Features

* Support invoke with structured output in OpenAI provider
([#980](#980))
([515dbdf](515dbdf))


### Dependencies

* The following workspace dependencies were updated
  * devDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.13.0 to ^0.14.0
  * peerDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.2 to ^0.14.0
</details>

<details><summary>server-sdk-ai-vercel: 0.3.0</summary>

## [0.3.0](server-sdk-ai-vercel-v0.2.0...server-sdk-ai-vercel-v0.3.0) (2025-11-06)


### ⚠ BREAKING CHANGES

* Support invoke with structured output in VercelAI provider
([#981](#981))

### Features

* Support invoke with structured output in VercelAI provider
([#981](#981))
([d0cb41d](d0cb41d))


### Dependencies

* The following workspace dependencies were updated
  * devDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.13.0 to ^0.14.0
  * peerDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.2 to ^0.14.0
</details>

---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

<!-- CURSOR_SUMMARY -->
---

> [!NOTE]
> Release server-ai 0.14.0 (judge evals, breaking renames/removals) and update the LangChain/OpenAI/Vercel providers to 0.3.0 with structured output; refresh examples and manifests to the new versions.
>
> - **SDK (`packages/sdk/server-ai`) — `0.14.0`**
>   - Adds real-time judge evaluations and related APIs (`createJudge`, `judgeConfig`, `trackEvalScores`); includes the AI SDK version in tracking information.
>   - Breaking: removes deprecated Vercel methods; requires `enabled` in AI Config defaults; renames several AI config types.
> - **AI Providers — `0.3.0`**
>   - `@launchdarkly/server-sdk-ai-langchain`, `-openai`, `-vercel`: add structured output support for `invoke` (breaking changes).
>   - Bump peer/dev dependency on `@launchdarkly/server-sdk-ai` to `^0.14.0`.
> - **Examples**
>   - Update example apps to use `@launchdarkly/server-sdk-ai@0.14.0` and provider packages `^0.3.0`.
> - **Release metadata**
>   - Update `.release-please-manifest.json` with the new versions.
>
> <sup>Written by [Cursor Bugbot](https://cursor.com/dashboard?tab=bugbot) for commit 00cd808. This will update automatically on new commits. Configure [here](https://cursor.com/dashboard?tab=bugbot).</sup>
<!-- /CURSOR_SUMMARY -->

---------

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: jsonbailey <jbailey@launchdarkly.com>