feat: Add VercelAI Provider for AI SDK #948
Conversation
Size reports (automated comments):

- `@launchdarkly/browser` size report
- `@launchdarkly/js-sdk-common` size report
- `@launchdarkly/js-client-sdk` size report
- `@launchdarkly/js-client-sdk-common` size report
🤖 I have created a release *beep* *boop*

---

<details><summary>server-sdk-ai: 0.12.1</summary>

## [0.12.1](server-sdk-ai-v0.12.0...server-sdk-ai-v0.12.1) (2025-10-14)

### Bug Fixes

* Improve documentation for AI SDK and AIProvider ([#958](#958)) ([17d595a](17d595a))

</details>

<details><summary>server-sdk-ai-langchain: 0.1.1</summary>

## [0.1.1](server-sdk-ai-langchain-v0.1.0...server-sdk-ai-langchain-v0.1.1) (2025-10-14)

### Bug Fixes

* Improve documentation for AI SDK and AIProvider ([#958](#958)) ([17d595a](17d595a))

### Dependencies

* The following workspace dependencies were updated
  * dependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.0 to ^0.12.1

</details>

<details><summary>server-sdk-ai-openai: 0.1.0</summary>

## 0.1.0 (2025-10-14)

### Features

* Add OpenAI Provider for AI SDK ([#947](#947)) ([5722911](5722911))

### Dependencies

* The following workspace dependencies were updated
  * dependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.0 to ^0.12.1

</details>

<details><summary>server-sdk-ai-vercel: 0.1.0</summary>

## 0.1.0 (2025-10-14)

### Features

* Add VercelAI Provider for AI SDK ([#948](#948)) ([1db731b](1db731b))

### Dependencies

* The following workspace dependencies were updated
  * dependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.0 to ^0.12.1

</details>

---

This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).

---

> [!NOTE]
> Publish server AI SDK 0.12.1, introduce new OpenAI and Vercel providers (0.1.0), bump LangChain provider to 0.1.1, and align dependencies/examples to ^0.12.1.
>
> - **AI SDK**:
>   - Bump `packages/sdk/server-ai` to `0.12.1` with changelog entry.
> - **AI Providers**:
>   - New `packages/ai-providers/server-ai-openai` at `0.1.0` with changelog; depends on `@launchdarkly/server-sdk-ai@^0.12.1`.
>   - New `packages/ai-providers/server-ai-vercel` at `0.1.0` with changelog; depends on `@launchdarkly/server-sdk-ai@^0.12.1`.
>   - Bump `packages/ai-providers/server-ai-langchain` to `0.1.1`; updates dep to `@launchdarkly/server-sdk-ai@^0.12.1` and changelog.
> - **Examples**:
>   - Update example apps to use `@launchdarkly/server-sdk-ai@0.12.1`.
> - **Release Manifest**:
>   - Update `.release-please-manifest.json` versions for the above packages.
>
> <sup>Written by [Cursor Bugbot](https://cursor.com/dashboard?tab=bugbot) for commit 6d5d7c6. This will update automatically on new commits. Configure [here](https://cursor.com/dashboard?tab=bugbot).</sup>

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
> [!NOTE]
> Introduce `packages/ai-providers/server-ai-vercel` with Vercel AI SDK integration, plus CI, release, and workspace wiring.
>
> - **New package** `packages/ai-providers/server-ai-vercel`: `VercelProvider` implementing `AIProvider` with `generateText` invocation, token-usage metrics, and support for `openai`, `anthropic`, `google`, `cohere`, and `mistral` via `@ai-sdk/*` (a usage sketch follows this note).
> - **CI**: new `.github/workflows/server-ai-vercel.yml` for build/test; update `.github/workflows/manual-publish.yml` to allow publishing the new package.
> - **Workspace**: add the package to the root `package.json` and to the TS project references in `tsconfig.json`.
> - **Release**: update `.release-please-manifest.json` and `release-please-config.json` with prerelease settings.
> - **Docs**: update `README.md` to list the new AI provider and badges/links; fix the langchain issues link.
>
> <sup>Written by Cursor Bugbot for commit 2ffc68a. This will update automatically on new commits. Configure here.</sup>
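For context on how the pieces in the note above fit together, here is a minimal, hedged sketch of calling the Vercel AI SDK under a LaunchDarkly AI Config and recording token usage. The AI Config key `'sample-ai-config'`, the example context, the default value, and the exact `trackTokens` field shape are illustrative assumptions, not taken from this PR's diff; the new `VercelProvider` is described as wrapping the `generateText` call and metrics reporting so application code would not do this by hand.

```typescript
// Hedged sketch: a direct Vercel AI SDK call gated by a LaunchDarkly AI Config.
// Names not present in this PR (config key, context, default value) are placeholders.
import { init } from '@launchdarkly/node-server-sdk';
import { initAi } from '@launchdarkly/server-sdk-ai';
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

async function main() {
  const ldClient = init(process.env.LAUNCHDARKLY_SDK_KEY ?? '');
  await ldClient.waitForInitialization({ timeout: 10 });

  const aiClient = initAi(ldClient);
  const context = { kind: 'user', key: 'example-user' };

  // Fetch the AI Config; 'sample-ai-config' is a placeholder key.
  const config = await aiClient.config('sample-ai-config', context, { enabled: false });

  if (config.enabled) {
    // Direct Vercel AI SDK call; per the note, VercelProvider would wrap this
    // generateText invocation and report token usage automatically.
    const { text, usage } = await generateText({
      model: openai('gpt-4o-mini'),
      prompt: 'Say hello to LaunchDarkly.',
    });

    // Record token-usage metrics against the AI Config. Usage field names here
    // assume Vercel AI SDK v4 (v5 renames them to inputTokens/outputTokens).
    config.tracker.trackTokens({
      total: usage.totalTokens ?? 0,
      input: usage.promptTokens ?? 0,
      output: usage.completionTokens ?? 0,
    });

    console.log(text);
  }

  ldClient.close();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

Routing token usage through the config's tracker is what lets the AI Config's metrics in LaunchDarkly reflect actual model usage rather than only flag evaluations; a provider package that does this wrapping removes the manual `trackTokens` call from application code.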