
ci: Version Packages#498

Merged
AlemTuzlak merged 1 commit into main from changeset-release/main
Apr 24, 2026

Conversation


@github-actions github-actions Bot commented Apr 23, 2026

This PR was opened by the Changesets release GitHub action. When you're ready to do a release, you can merge this and the packages will be published to npm automatically. If you're not ready to do a release yet, that's fine, whenever you add more changesets to main, this PR will be updated.

Releases

@tanstack/ai@0.14.0

Minor Changes

  • feat: add generateAudio activity for music and sound-effect generation (#463)

    Adds a new audio activity kind alongside the existing tts and transcription activities:

    • generateAudio() / createAudioOptions() functions
    • AudioAdapter interface and BaseAudioAdapter base class
    • AudioGenerationOptions / AudioGenerationResult / GeneratedAudio types
    • audio:request:started, audio:request:completed, and audio:usage devtools events
  • feat: add useGenerateAudio hook and streaming support for generateAudio() (#463)

    Closes the parity gap between audio generation and the other media
    activities (image, speech, video, transcription, summarize):

    • generateAudio() now accepts stream: true, returning an
      AsyncIterable<StreamChunk> that can be piped through
      toServerSentEventsResponse().
    • AudioGenerateInput type added to @tanstack/ai-client.
    • useGenerateAudio hook added to @tanstack/ai-react,
      @tanstack/ai-solid, and @tanstack/ai-vue; matching
      createGenerateAudio added to @tanstack/ai-svelte. All follow the same
      { generate, result, isLoading, error, status, stop, reset } shape as
      the existing media hooks and support both connection (SSE) and
      fetcher transports.
  • Tighten GeneratedImage and GeneratedAudio to enforce exactly one of url or b64Json via a mutually-exclusive GeneratedMediaSource union. (#463)

    Both types previously declared url? and b64Json? as independently optional, which allowed meaningless {} values and objects that set both fields. They now require exactly one:

    type GeneratedMediaSource =
      | { url: string; b64Json?: never }
      | { b64Json: string; url?: never }

    Existing read patterns like img.url || `data:image/png;base64,${img.b64Json}` continue to work unchanged. The only runtime-visible change is that the @tanstack/ai-openrouter and @tanstack/ai-fal image adapters no longer populate url with a synthesized data:image/png;base64,... URI when the provider returns base64; they now return { b64Json } only. Consumers that want a data URI should build it from b64Json at render time.
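The narrowing above can be seen in a small sketch. The union type is quoted from the changelog; the toDataUri helper is hypothetical, shown only to illustrate building a data URI from b64Json at render time:

```typescript
// The union from the changelog: exactly one of `url` or `b64Json`.
type GeneratedMediaSource =
  | { url: string; b64Json?: never }
  | { b64Json: string; url?: never }

// Hypothetical render-time helper (not part of the library): pass a
// provider URL through, or build a data URI from the base64 payload.
function toDataUri(src: GeneratedMediaSource, mime = 'image/png'): string {
  return src.url ?? `data:${mime};base64,${src.b64Json}`
}

console.log(toDataUri({ b64Json: 'abc123' })) // data:image/png;base64,abc123
console.log(toDataUri({ url: 'https://example.com/img.png' }))
```

With this union, `{}` and objects setting both fields fail to type-check, while either valid branch still reads naturally.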

Patch Changes

  • refactor(ai, ai-openai): narrow error handling before logging (#465)

    catch (error: any) sites in stream-to-response.ts, activities/stream-generation-result.ts, and activities/generateVideo/index.ts are now narrowed to unknown and funnel through a shared toRunErrorPayload(error, fallback) helper that extracts message / code without leaking the original error object (which can carry request state from an SDK).

    Replaced four console.error calls in the OpenAI text adapter's chatStream catch block that dumped the full error object to stdout. SDK errors can carry the original request including auth headers, so the library now logs only the narrowed { message, code } payload via the internal logger — any user-supplied logger receives the sanitized shape, not the raw SDK error.

  • Updated dependencies []:

    • @tanstack/ai-event-client@0.2.8
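The error-narrowing change above might look roughly like the following sketch. The toRunErrorPayload name comes from the changelog, but the exact signature and payload fields here are assumptions for illustration:

```typescript
// Sketch of a toRunErrorPayload-style helper: extract only message/code
// and never forward the raw error object, which may carry request state
// (including auth headers) from an SDK.
interface RunErrorPayload {
  message: string
  code?: string
}

function toRunErrorPayload(error: unknown, fallback: string): RunErrorPayload {
  if (error instanceof Error) {
    const code = (error as { code?: unknown }).code
    return {
      message: error.message || fallback,
      ...(typeof code === 'string' ? { code } : {}),
    }
  }
  if (typeof error === 'string') return { message: error }
  return { message: fallback }
}

console.log(toRunErrorPayload(new Error('boom'), 'request failed'))
// { message: 'boom' }
console.log(toRunErrorPayload({ weird: true }, 'request failed'))
// { message: 'request failed' }
```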

@tanstack/ai-client@0.8.0

Minor Changes

  • feat: add useGenerateAudio hook and streaming support for generateAudio() (#463)

    Closes the parity gap between audio generation and the other media
    activities (image, speech, video, transcription, summarize):

    • generateAudio() now accepts stream: true, returning an
      AsyncIterable<StreamChunk> that can be piped through
      toServerSentEventsResponse().
    • AudioGenerateInput type added to @tanstack/ai-client.
    • useGenerateAudio hook added to @tanstack/ai-react,
      @tanstack/ai-solid, and @tanstack/ai-vue; matching
      createGenerateAudio added to @tanstack/ai-svelte. All follow the same
      { generate, result, isLoading, error, status, stop, reset } shape as
      the existing media hooks and support both connection (SSE) and
      fetcher transports.
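Consuming the streamed result is a plain `for await` loop over the returned iterable. The chunk shape and stream below are stand-ins for illustration (the event names are taken from the @tanstack/ai changelog entry above, but the real StreamChunk type differs):

```typescript
// Hypothetical chunk shape; the real StreamChunk in @tanstack/ai differs.
interface StreamChunk {
  type: string
  data?: string
}

// Fake stream standing in for what generateAudio({ stream: true }) returns.
async function* fakeAudioStream(): AsyncIterable<StreamChunk> {
  yield { type: 'audio:request:started' }
  yield { type: 'delta', data: 'chunk-1' }
  yield { type: 'delta', data: 'chunk-2' }
  yield { type: 'audio:request:completed' }
}

// Generic consumer: the same loop works on any AsyncIterable of chunks.
async function collect(stream: AsyncIterable<StreamChunk>): Promise<string[]> {
  const parts: string[] = []
  for await (const chunk of stream) {
    if (chunk.data) parts.push(chunk.data)
  }
  return parts
}

collect(fakeAudioStream()).then((parts) => console.log(parts))
// [ 'chunk-1', 'chunk-2' ]
```

On a server, the same iterable can instead be handed to toServerSentEventsResponse() and returned from the route handler.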

Patch Changes

  • fix(ai-client): prevent drainPostStreamActions re-entrancy stealing queued actions (#429)

    When multiple client tools complete in the same round, nested drainPostStreamActions() calls from streamResponse()'s finally block could steal queued actions, permanently stalling the conversation. Added a re-entrancy guard and a shouldAutoSend() check requiring tool-call parts before triggering continuation.

  • Updated dependencies [54523f5, 54523f5, af9eb7b, 54523f5]:

    • @tanstack/ai@0.14.0
    • @tanstack/ai-event-client@0.2.8
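The re-entrancy guard described in the fix follows a common pattern: a nested call becomes a no-op and lets the outer loop keep draining. This is a minimal framework-free sketch; the names and structure are assumptions, not the actual @tanstack/ai-client implementation:

```typescript
type Action = () => void

const queue: Action[] = []
let draining = false

function drain(): void {
  if (draining) return // nested call: the outer loop is already draining
  draining = true
  try {
    while (queue.length > 0) {
      const action = queue.shift()!
      action() // may enqueue more actions or call drain() again
    }
  } finally {
    draining = false
  }
}

const log: string[] = []
queue.push(() => {
  log.push('tool-1')
  queue.push(() => log.push('continuation'))
  drain() // re-entrant call no-ops instead of stealing the queue
})
queue.push(() => log.push('tool-2'))
drain()
console.log(log) // [ 'tool-1', 'tool-2', 'continuation' ]
```

Without the guard, the nested drain() would consume 'tool-2' and 'continuation' inside the first action, and an outer loop relying on those queued actions would stall.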

@tanstack/ai-fal@0.7.0

Minor Changes

  • feat: add audio, speech, and transcription adapters to @tanstack/ai-fal (#463)

    Adds three new tree-shakeable adapters alongside the existing falImage() and falVideo():

    • falSpeech() — text-to-speech via models like fal-ai/gemini-3.1-flash-tts (Google), fal-ai/elevenlabs/tts/eleven-v3, fal-ai/minimax/speech-2.6-hd, and fal-ai/kokoro/*
    • falTranscription() — speech-to-text via fal-ai/whisper, fal-ai/wizper, fal-ai/speech-to-text/turbo, fal-ai/elevenlabs/speech-to-text
    • falAudio() — music and sound-effect generation via fal-ai/minimax-music/v2.6, fal-ai/diffrhythm, fal-ai/lyria2, fal-ai/stable-audio-25/text-to-audio, fal-ai/elevenlabs/sound-effects/v2

Patch Changes

  • Tighten GeneratedImage and GeneratedAudio to enforce exactly one of url or b64Json via a mutually-exclusive GeneratedMediaSource union. (#463)

    Both types previously declared url? and b64Json? as independently optional, which allowed meaningless {} values and objects that set both fields. They now require exactly one:

    type GeneratedMediaSource =
      | { url: string; b64Json?: never }
      | { b64Json: string; url?: never }

    Existing read patterns like img.url || `data:image/png;base64,${img.b64Json}` continue to work unchanged. The only runtime-visible change is that the @tanstack/ai-openrouter and @tanstack/ai-fal image adapters no longer populate url with a synthesized data:image/png;base64,... URI when the provider returns base64; they now return { b64Json } only. Consumers that want a data URI should build it from b64Json at render time.

  • Updated dependencies [54523f5, 54523f5, af9eb7b, 54523f5]:

    • @tanstack/ai@0.14.0

@tanstack/ai-gemini@0.10.0

Minor Changes

  • feat(ai-gemini): add Lyria 3 Pro / Clip audio adapter and Gemini 3.1 Flash TTS (#463)

    New adapter:

    • geminiAudio() for Google Lyria music generation — supports lyria-3-pro-preview (full-length songs, MP3/WAV 48 kHz stereo) and lyria-3-clip-preview (30-second MP3 clips)

    Enhanced:

    • Added gemini-3.1-flash-tts-preview to the TTS model list (70+ languages, 200+ audio tags for expressive control)
    • Added multiSpeakerVoiceConfig to GeminiTTSProviderOptions for 2-speaker dialogue generation

Patch Changes

  • Tighten GeneratedImage and GeneratedAudio to enforce exactly one of url or b64Json via a mutually-exclusive GeneratedMediaSource union. (#463)

    Both types previously declared url? and b64Json? as independently optional, which allowed meaningless {} values and objects that set both fields. They now require exactly one:

    type GeneratedMediaSource =
      | { url: string; b64Json?: never }
      | { b64Json: string; url?: never }

    Existing read patterns like img.url || `data:image/png;base64,${img.b64Json}` continue to work unchanged. The only runtime-visible change is that the @tanstack/ai-openrouter and @tanstack/ai-fal image adapters no longer populate url with a synthesized data:image/png;base64,... URI when the provider returns base64; they now return { b64Json } only. Consumers that want a data URI should build it from b64Json at render time.

  • Updated dependencies [54523f5, 54523f5, af9eb7b, 54523f5]:

    • @tanstack/ai@0.14.0

@tanstack/ai-grok@0.7.0

Minor Changes

  • feat(ai-grok): add audio and speech adapters for xAI (#506)

    Add three new tree-shakeable adapters that wrap xAI's audio APIs:

    • grokSpeech / createGrokSpeech — text-to-speech via POST /v1/tts. Supports the 5 xAI voices (eve, ara, rex, sal, leo), MP3/WAV/PCM/μ-law/A-law codecs, and the language, sample_rate, bit_rate, optimize_streaming_latency, text_normalization provider options.
    • grokTranscription / createGrokTranscription — speech-to-text via POST /v1/stt. Passes through language, diarize, multichannel, channels, audio_format, and sample_rate; maps xAI's word-level timestamps to TranscriptionResult.words.
    • grokRealtime / grokRealtimeToken — Voice Agent (realtime) adapter for wss://api.x.ai/v1/realtime with ephemeral tokens via /v1/realtime/client_secrets. Supports the grok-voice-fast-1.0 and grok-voice-think-fast-1.0 models.

    New model identifier exports: GROK_TTS_MODELS, GROK_TRANSCRIPTION_MODELS, GROK_REALTIME_MODELS and their corresponding types.

Patch Changes

  • Tighten GeneratedImage and GeneratedAudio to enforce exactly one of url or b64Json via a mutually-exclusive GeneratedMediaSource union. (#463)

    Both types previously declared url? and b64Json? as independently optional, which allowed meaningless {} values and objects that set both fields. They now require exactly one:

    type GeneratedMediaSource =
      | { url: string; b64Json?: never }
      | { b64Json: string; url?: never }

    Existing read patterns like img.url || `data:image/png;base64,${img.b64Json}` continue to work unchanged. The only runtime-visible change is that the @tanstack/ai-openrouter and @tanstack/ai-fal image adapters no longer populate url with a synthesized data:image/png;base64,... URI when the provider returns base64; they now return { b64Json } only. Consumers that want a data URI should build it from b64Json at render time.

  • Updated dependencies [54523f5, 54523f5, af9eb7b, 54523f5]:

    • @tanstack/ai@0.14.0

@tanstack/ai-react@0.8.0

Minor Changes

  • feat: add useGenerateAudio hook and streaming support for generateAudio() (#463)

    Closes the parity gap between audio generation and the other media
    activities (image, speech, video, transcription, summarize):

    • generateAudio() now accepts stream: true, returning an
      AsyncIterable<StreamChunk> that can be piped through
      toServerSentEventsResponse().
    • AudioGenerateInput type added to @tanstack/ai-client.
    • useGenerateAudio hook added to @tanstack/ai-react,
      @tanstack/ai-solid, and @tanstack/ai-vue; matching
      createGenerateAudio added to @tanstack/ai-svelte. All follow the same
      { generate, result, isLoading, error, status, stop, reset } shape as
      the existing media hooks and support both connection (SSE) and
      fetcher transports.

Patch Changes

  • fix(ai-react, ai-preact, ai-vue, ai-solid): propagate useChat callback changes (#465)

    onResponse, onChunk, and onCustomEvent were captured by reference at client creation time. When a parent component re-rendered with fresh closures, the ChatClient kept calling the originals. Every framework now wraps these callbacks so the latest options.xxx is read at call time (via optionsRef.current in React/Preact, and direct option access in Vue/Solid, matching the pattern already used for onFinish / onError). Clearing a callback (setting it to undefined) now correctly no-ops instead of continuing to invoke the stale handler.

  • Updated dependencies [54523f5, 54523f5, af9eb7b, 008f015, 54523f5]:

    • @tanstack/ai@0.14.0
    • @tanstack/ai-client@0.8.0

@tanstack/ai-solid@0.7.0

Minor Changes

  • feat: add useGenerateAudio hook and streaming support for generateAudio() (#463)

    Closes the parity gap between audio generation and the other media
    activities (image, speech, video, transcription, summarize):

    • generateAudio() now accepts stream: true, returning an
      AsyncIterable<StreamChunk> that can be piped through
      toServerSentEventsResponse().
    • AudioGenerateInput type added to @tanstack/ai-client.
    • useGenerateAudio hook added to @tanstack/ai-react,
      @tanstack/ai-solid, and @tanstack/ai-vue; matching
      createGenerateAudio added to @tanstack/ai-svelte. All follow the same
      { generate, result, isLoading, error, status, stop, reset } shape as
      the existing media hooks and support both connection (SSE) and
      fetcher transports.

Patch Changes

  • fix(ai-react, ai-preact, ai-vue, ai-solid): propagate useChat callback changes (#465)

    onResponse, onChunk, and onCustomEvent were captured by reference at client creation time. When a parent component re-rendered with fresh closures, the ChatClient kept calling the originals. Every framework now wraps these callbacks so the latest options.xxx is read at call time (via optionsRef.current in React/Preact, and direct option access in Vue/Solid, matching the pattern already used for onFinish / onError). Clearing a callback (setting it to undefined) now correctly no-ops instead of continuing to invoke the stale handler.

  • Updated dependencies [54523f5, 54523f5, af9eb7b, 008f015, 54523f5]:

    • @tanstack/ai@0.14.0
    • @tanstack/ai-client@0.8.0

@tanstack/ai-svelte@0.7.0

Minor Changes

  • feat: add useGenerateAudio hook and streaming support for generateAudio() (#463)

    Closes the parity gap between audio generation and the other media
    activities (image, speech, video, transcription, summarize):

    • generateAudio() now accepts stream: true, returning an
      AsyncIterable<StreamChunk> that can be piped through
      toServerSentEventsResponse().
    • AudioGenerateInput type added to @tanstack/ai-client.
    • useGenerateAudio hook added to @tanstack/ai-react,
      @tanstack/ai-solid, and @tanstack/ai-vue; matching
      createGenerateAudio added to @tanstack/ai-svelte. All follow the same
      { generate, result, isLoading, error, status, stop, reset } shape as
      the existing media hooks and support both connection (SSE) and
      fetcher transports.

Patch Changes

@tanstack/ai-vue@0.7.0

Minor Changes

  • feat: add useGenerateAudio hook and streaming support for generateAudio() (#463)

    Closes the parity gap between audio generation and the other media
    activities (image, speech, video, transcription, summarize):

    • generateAudio() now accepts stream: true, returning an
      AsyncIterable<StreamChunk> that can be piped through
      toServerSentEventsResponse().
    • AudioGenerateInput type added to @tanstack/ai-client.
    • useGenerateAudio hook added to @tanstack/ai-react,
      @tanstack/ai-solid, and @tanstack/ai-vue; matching
      createGenerateAudio added to @tanstack/ai-svelte. All follow the same
      { generate, result, isLoading, error, status, stop, reset } shape as
      the existing media hooks and support both connection (SSE) and
      fetcher transports.

Patch Changes

  • fix(ai-react, ai-preact, ai-vue, ai-solid): propagate useChat callback changes (#465)

    onResponse, onChunk, and onCustomEvent were captured by reference at client creation time. When a parent component re-rendered with fresh closures, the ChatClient kept calling the originals. Every framework now wraps these callbacks so the latest options.xxx is read at call time (via optionsRef.current in React/Preact, and direct option access in Vue/Solid, matching the pattern already used for onFinish / onError). Clearing a callback (setting it to undefined) now correctly no-ops instead of continuing to invoke the stale handler.

  • Updated dependencies [54523f5, 54523f5, af9eb7b, 008f015, 54523f5]:

    • @tanstack/ai@0.14.0
    • @tanstack/ai-client@0.8.0

@tanstack/ai-anthropic@0.8.2

Patch Changes

@tanstack/ai-code-mode@0.1.8

Patch Changes

@tanstack/ai-code-mode-skills@0.1.8

Patch Changes

@tanstack/ai-devtools-core@0.3.25

Patch Changes

@tanstack/ai-elevenlabs@0.1.8

Patch Changes

@tanstack/ai-event-client@0.2.8

Patch Changes

@tanstack/ai-groq@0.1.8

Patch Changes

@tanstack/ai-isolate-cloudflare@0.1.8

Patch Changes

  • feat(ai-isolate-cloudflare): support production deployments and close tool-name injection vector (#465)

    The Worker now documents production-capable unsafe_eval usage (previously the code, wrangler.toml, and README all described it as dev-only). Tool names are validated against a strict identifier regex before being interpolated into the generated wrapper code, so a malicious tool name like foo'); process.exit(1); (function bar() { is rejected at generation time rather than breaking out of the wrapping function.

  • Updated dependencies []:

    • @tanstack/ai-code-mode@0.1.8
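Validating a name against a strict identifier pattern before interpolating it into generated code can be sketched as follows; the exact regex and wrapper shape used by @tanstack/ai-isolate-cloudflare are assumptions for illustration:

```typescript
// A strict JavaScript-identifier pattern: rejects quotes, parens,
// semicolons, and whitespace, so the name cannot escape the wrapper.
const TOOL_NAME_RE = /^[A-Za-z_$][A-Za-z0-9_$]*$/

function wrapTool(name: string): string {
  if (!TOOL_NAME_RE.test(name)) {
    throw new Error(`Invalid tool name: ${JSON.stringify(name)}`)
  }
  // Safe to interpolate: validated above, and JSON.stringify quotes the
  // string-literal use (callHost is a hypothetical host bridge).
  return `async function ${name}(args) { return callHost(${JSON.stringify(name)}, args) }`
}

console.log(wrapTool('getWeather'))

// A malicious name is rejected at generation time rather than breaking
// out of the wrapping function:
try {
  wrapTool("foo'); process.exit(1); (function bar() {")
} catch (e) {
  console.log((e as Error).message)
}
```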

@tanstack/ai-isolate-node@0.1.8

Patch Changes

  • Updated dependencies []:
    • @tanstack/ai-code-mode@0.1.8

@tanstack/ai-isolate-quickjs@0.1.8

Patch Changes

  • Updated dependencies []:
    • @tanstack/ai-code-mode@0.1.8

@tanstack/ai-ollama@0.6.10

Patch Changes

  • refactor(ai-ollama): extract tool conversion into src/tools/ matching peer adapters (#465)

    Tool handling lived inline inside the text adapter with raw type casts. It is now split into a dedicated tool-converter.ts / function-tool.ts pair (mirroring the structure used by ai-openai, ai-anthropic, ai-grok, and ai-groq) and re-exported from the package index as convertFunctionToolToAdapterFormat and convertToolsToProviderFormat. Runtime behavior is unchanged.

  • Updated dependencies [54523f5, 54523f5, af9eb7b, 54523f5]:

    • @tanstack/ai@0.14.0

@tanstack/ai-openai@0.8.2

Patch Changes

  • refactor(ai, ai-openai): narrow error handling before logging (#465)

    catch (error: any) sites in stream-to-response.ts, activities/stream-generation-result.ts, and activities/generateVideo/index.ts are now narrowed to unknown and funnel through a shared toRunErrorPayload(error, fallback) helper that extracts message / code without leaking the original error object (which can carry request state from an SDK).

    Replaced four console.error calls in the OpenAI text adapter's chatStream catch block that dumped the full error object to stdout. SDK errors can carry the original request including auth headers, so the library now logs only the narrowed { message, code } payload via the internal logger — any user-supplied logger receives the sanitized shape, not the raw SDK error.

  • Tighten GeneratedImage and GeneratedAudio to enforce exactly one of url or b64Json via a mutually-exclusive GeneratedMediaSource union. (#463)

    Both types previously declared url? and b64Json? as independently optional, which allowed meaningless {} values and objects that set both fields. They now require exactly one:

    type GeneratedMediaSource =
      | { url: string; b64Json?: never }
      | { b64Json: string; url?: never }

    Existing read patterns like img.url || `data:image/png;base64,${img.b64Json}` continue to work unchanged. The only runtime-visible change is that the @tanstack/ai-openrouter and @tanstack/ai-fal image adapters no longer populate url with a synthesized data:image/png;base64,... URI when the provider returns base64; they now return { b64Json } only. Consumers that want a data URI should build it from b64Json at render time.

  • Updated dependencies [54523f5, 54523f5, af9eb7b, 008f015, 54523f5]:

    • @tanstack/ai@0.14.0
    • @tanstack/ai-client@0.8.0

@tanstack/ai-openrouter@0.8.2

Patch Changes

  • Tighten GeneratedImage and GeneratedAudio to enforce exactly one of url or b64Json via a mutually-exclusive GeneratedMediaSource union. (#463)

    Both types previously declared url? and b64Json? as independently optional, which allowed meaningless {} values and objects that set both fields. They now require exactly one:

    type GeneratedMediaSource =
      | { url: string; b64Json?: never }
      | { b64Json: string; url?: never }

    Existing read patterns like img.url || `data:image/png;base64,${img.b64Json}` continue to work unchanged. The only runtime-visible change is that the @tanstack/ai-openrouter and @tanstack/ai-fal image adapters no longer populate url with a synthesized data:image/png;base64,... URI when the provider returns base64; they now return { b64Json } only. Consumers that want a data URI should build it from b64Json at render time.

  • Updated dependencies [54523f5, 54523f5, af9eb7b, 54523f5]:

    • @tanstack/ai@0.14.0

@tanstack/ai-preact@0.6.20

Patch Changes

  • fix(ai-react, ai-preact, ai-vue, ai-solid): propagate useChat callback changes (#465)

    onResponse, onChunk, and onCustomEvent were captured by reference at client creation time. When a parent component re-rendered with fresh closures, the ChatClient kept calling the originals. Every framework now wraps these callbacks so the latest options.xxx is read at call time (via optionsRef.current in React/Preact, and direct option access in Vue/Solid, matching the pattern already used for onFinish / onError). Clearing a callback (setting it to undefined) now correctly no-ops instead of continuing to invoke the stale handler.

  • Updated dependencies [54523f5, 54523f5, af9eb7b, 008f015, 54523f5]:

    • @tanstack/ai@0.14.0
    • @tanstack/ai-client@0.8.0

@tanstack/ai-react-ui@0.6.2

Patch Changes

@tanstack/ai-solid-ui@0.6.2

Patch Changes

@tanstack/ai-vue-ui@0.1.31

Patch Changes

@tanstack/preact-ai-devtools@0.1.29

Patch Changes

  • Updated dependencies []:
    • @tanstack/ai-devtools-core@0.3.25

@tanstack/react-ai-devtools@0.2.29

Patch Changes

  • Updated dependencies []:
    • @tanstack/ai-devtools-core@0.3.25

@tanstack/solid-ai-devtools@0.2.29

Patch Changes

  • Updated dependencies []:
    • @tanstack/ai-devtools-core@0.3.25

ts-svelte-chat@0.1.37

Patch Changes

  • Updated dependencies [54523f5, 54523f5, af9eb7b, 008f015, 54523f5, 54523f5, af9eb7b]:
    • @tanstack/ai@0.14.0
    • @tanstack/ai-client@0.8.0
    • @tanstack/ai-svelte@0.7.0
    • @tanstack/ai-openai@0.8.2
    • @tanstack/ai-gemini@0.10.0
    • @tanstack/ai-ollama@0.6.10
    • @tanstack/ai-anthropic@0.8.2

ts-vue-chat@0.1.37

Patch Changes

  • Updated dependencies [54523f5, 54523f5, af9eb7b, 008f015, 54523f5, 54523f5, af9eb7b, af9eb7b]:
    • @tanstack/ai@0.14.0
    • @tanstack/ai-client@0.8.0
    • @tanstack/ai-vue@0.7.0
    • @tanstack/ai-openai@0.8.2
    • @tanstack/ai-gemini@0.10.0
    • @tanstack/ai-ollama@0.6.10
    • @tanstack/ai-anthropic@0.8.2
    • @tanstack/ai-vue-ui@0.1.31

vanilla-chat@0.0.35

Patch Changes

@tanstack/ai-code-mode-models-eval@0.0.11

Patch Changes

  • Updated dependencies [54523f5, 54523f5, af9eb7b, 54523f5, 54523f5, 2e4c942, af9eb7b]:
    • @tanstack/ai@0.14.0
    • @tanstack/ai-openai@0.8.2
    • @tanstack/ai-gemini@0.10.0
    • @tanstack/ai-grok@0.7.0
    • @tanstack/ai-ollama@0.6.10
    • @tanstack/ai-anthropic@0.8.2
    • @tanstack/ai-code-mode@0.1.8
    • @tanstack/ai-groq@0.1.8
    • @tanstack/ai-isolate-node@0.1.8

github-actions Bot force-pushed the changeset-release/main branch 3 times, most recently from 845bc49 to ad562ab on April 24, 2026 at 11:34
github-actions Bot force-pushed the changeset-release/main branch from ad562ab to 8eff30c on April 24, 2026 at 12:56
@AlemTuzlak AlemTuzlak merged commit af19dcc into main Apr 24, 2026
@AlemTuzlak AlemTuzlak deleted the changeset-release/main branch April 24, 2026 13:07