Vision capabilities enable models to analyze images and provide insights based on visual content in addition to text. This multimodal approach opens up new possibilities for applications that require both textual and visual understanding.
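For example, a request that mixes text and an image with the Python SDK might look like the minimal sketch below (assuming `MISTRAL_API_KEY` is set; the image URL is a placeholder):

```python
import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# One user message that carries both a text part and an image part.
response = client.chat.complete(
    model="pixtral-12b-latest",  # any model with vision capabilities
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe what is shown in this image."},
                # Placeholder URL; a base64 data URL also works here.
                {"type": "image_url", "image_url": "https://example.com/photo.jpg"},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```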

-For more specific use cases regarding document parsing and data extraction we recommend taking a look at our Document AI stack [here](../OCR/document_ai_overview).
+For more specific use cases regarding document parsing and data extraction we recommend taking a look at our Document AI stack [here](../document_ai/document_ai_overview).

## Models with Vision Capabilities:
- Pixtral 12B (`pixtral-12b-latest`)
@@ -13739,7 +13749,10 @@ in two ways:
This page focuses on the MaaS offering, where the following models are available:

- Mistral Large (24.11, 24.07)
-- Mistral Small (24.09)
+- Mistral Medium (25.05)
+- Mistral Small (25.03)
+- Mistral Document AI (25.05)
+- Mistral OCR (25.05)
- Ministral 3B (24.10)
- Mistral Nemo

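As an illustrative sketch (one option among several), a serverless MaaS deployment can be queried with the `azure-ai-inference` client; the `AZURE_MISTRAL_ENDPOINT` and `AZURE_MISTRAL_KEY` environment variable names below are placeholders for your deployment's endpoint URL and key:

```python
import os
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import UserMessage
from azure.core.credentials import AzureKeyCredential

# Placeholder env var names; use whatever names your own setup defines.
client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_MISTRAL_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_MISTRAL_KEY"]),
)

# A serverless endpoint is bound to one deployed model, so no model name is passed here.
response = client.complete(
    messages=[UserMessage(content="In one sentence, what does MaaS deployment mean?")],
)
print(response.choices[0].message.content)
```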
@@ -13843,9 +13856,11 @@ To run the examples below, set the following environment variables:
## Going further

For more details and examples, refer to the following resources:
+- [Release blog post for Mistral Document AI](https://techcommunity.microsoft.com/blog/aiplatformblog/deepening-our-partnership-with-mistral-ai-on-azure-ai-foundry/4434656)
- [Release blog post for Mistral Large 2 and Mistral NeMo](https://techcommunity.microsoft.com/t5/ai-machine-learning-blog/ai-innovation-continues-introducing-mistral-large-2-and-mistral/ba-p/4200181).
- [Azure documentation for MaaS deployment of Mistral models](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-mistral).
- [Azure ML examples GitHub repository](https://github.com/Azure/azureml-examples/tree/main/sdk/python/foundation-models/mistral) with several Mistral-based samples.
+- [Azure AI Foundry GitHub repository](https://github.com/azure-ai-foundry/foundry-samples/tree/main/samples/mistral)

[IBM watsonx.ai]
@@ -14089,7 +14104,7 @@ To run the examples below you will need to set the following environment variabl
Codestral can be queried using an additional completion mode called fill-in-the-middle (FIM).
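A minimal FIM sketch with the Python SDK (assuming `MISTRAL_API_KEY` is set; the prompt/suffix pair is illustrative):

```python
import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# Fill-in-the-middle: the model generates the code that belongs between `prompt` and `suffix`.
response = client.fim.complete(
    model="codestral-latest",
    prompt="def fibonacci(n: int) -> int:\n",
    suffix="\n\nprint(fibonacci(10))",
)
print(response.choices[0].message.content)
```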
| Mistral Medium 3 | | :heavy_check_mark: | Our frontier-class multimodal model released May 2025. Learn more in our [blog post](https://mistral.ai/news/mistral-medium-3/) | 128k | `mistral-medium-2505` | 25.05|
+| Mistral Medium 3.1 | | :heavy_check_mark: | Our frontier-class multimodal model released August 2025, improving tone and performance. Read more about Medium 3 in our [blog post](https://mistral.ai/news/mistral-medium-3/) | 128k | `mistral-medium-2508` | 25.08|
| Magistral Medium 1.1 | | :heavy_check_mark: | Our frontier-class reasoning model released July 2025. | 40k | `magistral-medium-2507` | 25.07|
| Codestral 2508 | | :heavy_check_mark: | Our cutting-edge language model for coding, released at the end of July 2025. Codestral specializes in low-latency, high-frequency tasks such as fill-in-the-middle (FIM), code correction and test generation. Learn more in our [blog post](https://mistral.ai/news/codestral-25-08/) | 256k | `codestral-2508` | 25.08|
| Voxtral Mini Transcribe | | :heavy_check_mark: | An efficient audio input model, fine-tuned and optimized for transcription purposes only. | | `voxtral-mini-2507` via `audio/transcriptions` | 25.07|
@@ -16207,6 +16222,7 @@ Mistral provides two types of models: open models and premier models.
| Magistral Medium 1 | | :heavy_check_mark: | Our first frontier-class reasoning model released June 2025. Learn more in our [blog post](https://mistral.ai/news/magistral/) | 40k | `magistral-medium-2506` | 25.06|
| Ministral 3B | | :heavy_check_mark: | World’s best edge model. Learn more in our [blog post](https://mistral.ai/news/ministraux/) | 128k | `ministral-3b-2410` | 24.10|
| Ministral 8B | :heavy_check_mark: <br/> [Mistral Research License](https://mistral.ai/licenses/MRL-0.1.md)| :heavy_check_mark: | Powerful edge model with an extremely high performance/price ratio. Learn more in our [blog post](https://mistral.ai/news/ministraux/) | 128k | `ministral-8b-2410` | 24.10|
+| Mistral Medium 3 | | :heavy_check_mark: | Our frontier-class multimodal model released May 2025. Learn more in our [blog post](https://mistral.ai/news/mistral-medium-3/) | 128k | `mistral-medium-2505` | 25.05|
| Codestral 2501 | | :heavy_check_mark: | Our cutting-edge language model for coding, with the second version released January 2025. Codestral specializes in low-latency, high-frequency tasks such as fill-in-the-middle (FIM), code correction and test generation. Learn more in our [blog post](https://mistral.ai/news/codestral-2501/) | 256k | `codestral-2501` | 25.01|
| Mistral Large 2.1 |:heavy_check_mark: <br/> [Mistral Research License](https://mistral.ai/licenses/MRL-0.1.md)| :heavy_check_mark: | Our top-tier large model for high-complexity tasks, with the latest version released November 2024. Learn more in our [blog post](https://mistral.ai/news/pixtral-large/) | 128k | `mistral-large-2411` | 24.11|
| Pixtral Large |:heavy_check_mark: <br/> [Mistral Research License](https://mistral.ai/licenses/MRL-0.1.md)| :heavy_check_mark: | Our first frontier-class multimodal model released November 2024. Learn more in our [blog post](https://mistral.ai/news/pixtral-large/) | 128k | `pixtral-large-2411` | 24.11|
@@ -16241,8 +16257,8 @@ Additionally, be prepared for the deprecation of certain endpoints in the coming
Here are the details of the available versions:
- `magistral-medium-latest`: currently points to `magistral-medium-2507`.
- `magistral-small-latest`: currently points to `magistral-small-2507`.
-- `mistral-medium-latest`: currently points to `mistral-medium-2505`.
-- `mistral-large-latest`: currently points to `mistral-large-2411`.
+- `mistral-medium-latest`: currently points to `mistral-medium-2508`.
+- `mistral-large-latest`: currently points to `mistral-medium-2508`, previously `mistral-large-2411`.
- `pixtral-large-latest`: currently points to `pixtral-large-2411`.
- `mistral-moderation-latest`: currently points to `mistral-moderation-2411`.
- `ministral-3b-latest`: currently points to `ministral-3b-2410`.
@@ -18984,6 +19000,24 @@ Here is an [example notebook](https://github.com/mistralai/cookbook/blob/main/th
Maxim AI provides comprehensive observability for your Mistral-based AI applications. With Maxim's one-line integration, you can easily trace and analyse LLM calls, metrics, and more.
+
+**Pros:**
+
+* Performance Analytics: Track latency, tokens consumed, and costs
+* Advanced Visualisation: Understand agent trajectories through intuitive dashboards
+
+**Mistral integration Example:**
+
+* Learn how to integrate Maxim observability with the Mistral SDK in just one line of code - [Colab Notebook](https://github.com/mistralai/cookbook/blob/main/third_party/Maxim/cookbook_maxim_mistral_integration.ipynb)
+
+* Maxim documentation for using Mistral as an LLM provider and Maxim as a logger - [Docs Link](https://www.getmaxim.ai/docs/sdk/python/integrations/mistral/mistral)
@@ -20736,18 +20770,3 @@ Mistral AI's LLM API endpoints charge based on the number of tokens in the input
To help you estimate your costs, our tokenization API makes it easy to count the number of tokens in your text. Simply run `len(tokens)` as shown in the example above to get the total number of tokens in the text, which you can then use to estimate your cost based on our pricing information.
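For instance, a rough count can be obtained locally with the open-source `mistral-common` tokenizer; this is a sketch, and the tokenizer version (v3 below) should match the model you intend to call:

```python
from mistral_common.protocol.instruct.messages import UserMessage
from mistral_common.protocol.instruct.request import ChatCompletionRequest
from mistral_common.tokens.tokenizers.mistral import MistralTokenizer

# Load a tokenizer version compatible with the target model.
tokenizer = MistralTokenizer.v3()

tokenized = tokenizer.encode_chat_completion(
    ChatCompletionRequest(
        messages=[UserMessage(content="How many tokens does this request use?")],
        model="test",  # placeholder model name; only the tokenization matters here
    )
)

tokens = tokenized.tokens
print(len(tokens))  # token count, usable for a rough cost estimate
```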

-
-[Mistral AI Crawlers]
-Source: https://docs.mistral.ai/docs/robots
-
-## Mistral AI Crawlers
-
-Mistral AI employs web crawlers ("robots") and user agents to execute tasks for its products, either automatically or upon user request. To facilitate webmasters in managing how their sites and content interact with AI, Mistral AI utilizes specific robots.txt tags.
-
-### MistralAI-User
-
-MistralAI-User is for user actions in LeChat. When users ask LeChat a question, it may visit a web page to help answer and include a link to the source in its response. MistralAI-User governs which sites these user requests can be made to. It is not used for crawling the web in any automatic fashion, nor to crawl content for generative AI training.
-
-Full user-agent string: Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; MistralAI-User/1.0; +https://docs.mistral.ai/robots)
-
-Published IP addresses: https://mistral.ai/mistralai-user-ips.json
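For webmasters, this control is expressed through standard robots.txt directives; a minimal illustrative entry (the path is a placeholder) could look like:

```
# Illustrative robots.txt entry: keep MistralAI-User out of a private area.
User-agent: MistralAI-User
Disallow: /private/
```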