Add support for new Mistral Magistral models (magistral-medium-2506 and magistral-small-2506) #11588

Merged: 1 commit into BerriAI:main on Jun 10, 2025

Conversation

colesmcintosh (Collaborator)

Title

Add support for new Mistral Magistral models (magistral-medium-2506 and magistral-small-2506)

Relevant issues

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory (adding at least 1 test is a hard requirement, see details)
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem

Type

🆕 New Feature

Changes

This PR adds support for two new Mistral Magistral models by updating the model_prices_and_context_window.json file:

Added Models:

  • mistral/magistral-medium-2506:

    • Input cost: $2.00 per 1M tokens
    • Output cost: $5.00 per 1M tokens
    • Max tokens: 40,000 (input/output)
    • Supports function calling, assistant prefill, and tool choice
  • mistral/magistral-small-2506:

    • Free model (no cost)
    • Max tokens: 40,000 (input/output)
    • Supports function calling, assistant prefill, and tool choice
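As a quick sanity check on the magistral-medium-2506 rates listed above, the per-request cost arithmetic can be sketched as follows (an illustrative snippet only; the `request_cost` helper is hypothetical and not part of LiteLLM):

```python
# Per-token prices for magistral-medium-2506, as listed in this PR.
INPUT_COST_PER_TOKEN = 2.00 / 1_000_000   # $2.00 per 1M input tokens
OUTPUT_COST_PER_TOKEN = 5.00 / 1_000_000  # $5.00 per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request at the listed rates."""
    return (input_tokens * INPUT_COST_PER_TOKEN
            + output_tokens * OUTPUT_COST_PER_TOKEN)

# e.g. a request with 1,000 prompt tokens and 500 completion tokens:
print(round(request_cost(1_000, 500), 6))  # → 0.0045
```

This mirrors how per-token pricing in model_prices_and_context_window.json is applied: each entry stores cost per single token, so the dollar prices above are divided by one million.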

Both models are configured with a 40,000-token limit, "mode": "chat", and the "mistral" provider.

Technical Details:

  • Updated model_prices_and_context_window.json with pricing and capability information
  • Follows existing patterns for Mistral model configurations
  • Maintains consistency with other model definitions in the file
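For reference, an entry matching these details and the fields quoted in the review below would look roughly like this (a sketch, not the exact merged diff; the `supports_*` key names are assumed to follow LiteLLM's usual model_prices_and_context_window.json conventions):

```json
"mistral/magistral-small-2506": {
    "max_tokens": 40000,
    "max_input_tokens": 40000,
    "max_output_tokens": 40000,
    "input_cost_per_token": 0.0,
    "output_cost_per_token": 0.0,
    "litellm_provider": "mistral",
    "mode": "chat",
    "supports_function_calling": true,
    "supports_assistant_prefill": true,
    "supports_tool_choice": true,
    "source": "https://mistral.ai/news/magistral"
}
```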

…al/magistral-medium-2506' and 'mistral/magistral-small-2506' with token limits and cost details

vercel bot commented Jun 10, 2025:

litellm: ✅ Ready (preview deployed), updated Jun 10, 2025 2:43pm UTC

@colesmcintosh colesmcintosh marked this pull request as draft June 10, 2025 15:22
@colesmcintosh colesmcintosh marked this pull request as ready for review June 10, 2025 18:19
@ishaan-jaff (Contributor) left a comment:
LGTM

@colesmcintosh colesmcintosh merged commit 74bf901 into BerriAI:main Jun 10, 2025
6 checks passed
"max_tokens": 40000,
"max_input_tokens": 40000,
"max_output_tokens": 40000,
"input_cost_per_token": 0.0,
Contributor commented:
is this meant to be 0?

"output_cost_per_token": 0.0,
"litellm_provider": "mistral",
"mode": "chat",
"source": "https://mistral.ai/news/magistral",
Contributor commented:

I don't see the pricing on this link. Do you mind sharing what you see?

Trying to validate the magistral small pricing

Contributor commented:

Here's what I see:
[Screenshot: 2025-06-14 at 3:46:45 PM]

colesmcintosh (Author) replied:

I think they just changed this; it was free on the release date.

@colesmcintosh colesmcintosh deleted the add-magistral-sm-md branch June 24, 2025 16:38
3 participants