feat(mistral): migrate LLM to Conversations API with provider tools support#5527

Merged
tinalenguyen merged 6 commits into livekit:main from jeanprbt:jean/feat/add-mistral-provider-tools on Apr 22, 2026

Conversation

@jeanprbt
Contributor

Summary

  • Migrates the Mistral AI LLM plugin from the Chat Completions API to the Conversations API,
    enabling support for Mistral's built-in provider tools: Web Search, Code Interpreter, and Document Library
  • Adds MistralTool base class with WebSearch, CodeInterpreter, and DocumentLibrary provider tool definitions
  • Extends LLM constructor with full completion parameter support (top_p, presence_penalty, frequency_penalty, random_seed, tool_choice)
  • Switches default TTS response format from pcm to mp3

Details

Conversations API migration

The Conversations API is stateful — the server retains conversation history across calls. This enables:

  • First call: start_stream_async() sends the full context, model, instructions, and tools
  • Subsequent calls: append_stream_async() sends only new client-originated entries (function.result, message.input), avoiding redundant re-transmission of the full history

Context diffing uses ChatContext.is_equivalent() (following the OpenAI Responses API pattern) to detect whether the new context extends the previous one. If it diverges, the conversation is reset with a fresh start_stream_async().

A pending tool call verification safeguard ensures all function calls from the previous response have results before attempting to append.
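That safeguard amounts to a set-containment check over the entry list. A minimal sketch, assuming flat entries keyed by `tool_call_id` (the function name here is illustrative, not the plugin's):

```python
def all_calls_resolved(entries: list[dict]) -> bool:
    # Every function.call from the previous response must have a matching
    # function.result before new entries may be appended to the conversation.
    call_ids = {e["tool_call_id"] for e in entries if e["type"] == "function.call"}
    result_ids = {e["tool_call_id"] for e in entries if e["type"] == "function.result"}
    return call_ids <= result_ids
```
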

Streaming event handling

The Conversations API uses typed SSE events instead of choice deltas.

Event handling:

  • ResponseStartedEvent: captures conversation_id for stateful tracking
  • MessageOutputEvent: emits a ChatChunk with text content
  • FunctionCallEvent: accumulated in a _PendingFunctionCall buffer and flushed as a complete FunctionToolCall on the next non-function event (arguments arrive incrementally as deltas)
  • ToolExecution{Started,Delta,Done}Event: server-side provider tool execution; arguments are accumulated silently and logged once on completion
  • ResponseDoneEvent: extracts token usage
  • ResponseErrorEvent: raises APIStatusError
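The delta-wise accumulation of function-call arguments can be sketched as below. The event classes are stand-ins for illustration, not the actual mistralai SDK types:

```python
from dataclasses import dataclass


@dataclass
class FunctionCallEvent:
    tool_call_id: str
    name: str = ""
    arguments: str = ""  # arguments arrive incrementally as string deltas


@dataclass
class MessageOutputEvent:
    content: str


@dataclass
class _PendingFunctionCall:
    tool_call_id: str
    name: str = ""
    arguments: str = ""


class StreamHandler:
    def __init__(self) -> None:
        self._pending: _PendingFunctionCall | None = None
        self.tool_calls: list[tuple[str, str, str]] = []
        self.text: str = ""

    def _flush_pending(self) -> None:
        # Emit the buffered call once all argument deltas have arrived.
        if self._pending is not None:
            p = self._pending
            self.tool_calls.append((p.tool_call_id, p.name, p.arguments))
            self._pending = None

    def handle(self, event) -> None:
        if isinstance(event, FunctionCallEvent):
            if self._pending is None:
                self._pending = _PendingFunctionCall(event.tool_call_id, event.name)
            self._pending.arguments += event.arguments
        else:
            # Any non-function event signals the call's arguments are complete.
            self._flush_pending()
            if isinstance(event, MessageOutputEvent):
                self.text += event.content
```
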

Provider format converter

Replaces the old to_chat_ctx() (which delegated to OpenAI format) with to_conversations_ctx() that produces Mistral's flat entry format:

  • System/developer messages → extracted as instructions string (returned via MistralFormatData)
  • User messages → {"type": "message.input", ...}
  • Assistant messages → {"type": "message.output", ...}
  • Function calls → {"type": "function.call", ...}
  • Function results → {"type": "function.result", ...}
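The mapping above can be illustrated with a simplified converter. This is a hedged sketch over plain role-tagged dicts (the real `to_conversations_ctx()` operates on ChatContext items and returns a MistralFormatData):

```python
def to_conversations_entries(messages: list[dict]) -> tuple[str, list[dict]]:
    """Convert a simple chat history into Mistral's flat entry format."""
    instructions_parts: list[str] = []
    entries: list[dict] = []
    for msg in messages:
        role = msg["role"]
        if role in ("system", "developer"):
            # System/developer content is extracted into the instructions string.
            instructions_parts.append(msg["content"])
        elif role == "user":
            entries.append({"type": "message.input", "role": "user", "content": msg["content"]})
        elif role == "assistant":
            entries.append({"type": "message.output", "role": "assistant", "content": msg["content"]})
        elif role == "tool_call":
            entries.append({"type": "function.call", "tool_call_id": msg["id"],
                            "name": msg["name"], "arguments": msg["arguments"]})
        elif role == "tool_result":
            entries.append({"type": "function.result", "tool_call_id": msg["id"],
                            "result": msg["content"]})
    return "\n".join(instructions_parts), entries
```
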

LLM completion parameters

The LLM constructor and update_options() now accept the full set of CompletionArgs parameters supported by the Conversations API: temperature, top_p, max_completion_tokens, presence_penalty, frequency_penalty, random_seed, and tool_choice.

TTS default response format

Switches the default TTS response format from pcm back to mp3, since it is the most reliable default for most setups.

Usage

from livekit.agents import Agent
from livekit.plugins import mistralai

llm = mistralai.LLM(model="mistral-medium-latest")

# With provider tools
agent = Agent(
    llm=llm,
    tools=[
        mistralai.tools.WebSearch(),
        mistralai.tools.CodeInterpreter(),
        mistralai.tools.DocumentLibrary(library_ids=["<your-library-id>"]),
    ],
)

@jeanprbt changed the title from "feat( mistral): migrate LLM to Conversations API with provider tools support" to "feat(mistral): migrate LLM to Conversations API with provider tools support" on Apr 22, 2026
@tinalenguyen tinalenguyen merged commit 10d538d into livekit:main Apr 22, 2026
16 checks passed