Add MiniMax model support (M2.5 + M2.7) via OpenAI-compatible API#7393

Open
octo-patch wants to merge 2 commits into microsoft:main from octo-patch:feature/add-minimax-provider
Conversation


@octo-patch octo-patch commented Mar 13, 2026

Summary

This PR adds first-class support for MiniMax models to AutoGen's OpenAI-compatible client, following the same pattern used for Gemini, Anthropic, and Llama providers.

Supported Models

| Model | Description |
| --- | --- |
| MiniMax-M2.7 | Latest flagship model; 204K context; vision, function calling, and structured output |
| MiniMax-M2.7-highspeed | Same capabilities as M2.7 with optimized inference speed |
| MiniMax-M2.5 | Previous generation; 204K context |
| MiniMax-M2.5-highspeed | Same capabilities as M2.5 with optimized inference speed |

Changes

  • autogen-core: Add MINIMAX_M2_5 and MINIMAX_M2_7 to ModelFamily with is_minimax() helper method
  • autogen-ext (OpenAI provider):
    • Register MiniMax models in _model_info.py with capabilities and 204K token limits
    • Add MINIMAX_API_BASE_URL constant (https://api.minimax.io/v1)
    • Auto-detect MINIMAX_API_KEY env var and auto-configure base URL for MiniMax-* models
  • Docs: Add MiniMax tutorial section in the models notebook

Usage
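A minimal usage sketch, assuming `autogen-core` and `autogen-ext` are installed and `MINIMAX_API_KEY` is exported (so, per this PR, the base URL is auto-configured from the model name):

```python
import os


async def main() -> None:
    # These imports require the autogen-core and autogen-ext packages.
    from autogen_core.models import UserMessage
    from autogen_ext.models.openai import OpenAIChatCompletionClient

    # With MINIMAX_API_KEY set, only the model name should be needed;
    # base_url and api_key are filled in by the auto-detection in this PR.
    client = OpenAIChatCompletionClient(model="MiniMax-M2.7")
    result = await client.create([UserMessage(content="What is 2 + 2?", source="user")])
    print(result.content)


# Run with: asyncio.run(main())
```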

Test Plan

  • Verified M2.7 and M2.7-highspeed models respond correctly via API
  • Syntax validation passes on all modified files
  • Notebook JSON is valid

Add MiniMax M2.5 and M2.5-highspeed models to the OpenAI client,
following the same pattern used for Gemini, Anthropic, and Llama
providers. Changes include:

- Add MINIMAX_M2_5 to ModelFamily with is_minimax() helper
- Register MiniMax models in _model_info.py (capabilities, token limits)
- Auto-detect MiniMax model prefix for base_url and MINIMAX_API_KEY
- Add MiniMax section to models tutorial documentation
@octo-patch
Author

@microsoft-github-policy-service agree

Upgrade MiniMax integration to include the latest M2.7 flagship models
alongside existing M2.5 models. M2.7 offers the same 204K context window,
vision, function calling, and structured output capabilities.
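The registrations this commit describes could look roughly like the sketch below. The dict field names mirror AutoGen's capability flags but are assumptions here, not copied from the PR diff, and the exact token count behind "204K" is assumed.

```python
# Illustrative sketch of _model_info.py entries for the M2.7 models;
# field names and the exact token limit are assumptions.
MINIMAX_MODEL_INFO = {
    "MiniMax-M2.7": {
        "vision": True,
        "function_calling": True,
        "json_output": True,
        "family": "minimax-m2.7",
    },
    "MiniMax-M2.7-highspeed": {
        "vision": True,
        "function_calling": True,
        "json_output": True,
        "family": "minimax-m2.7",
    },
    # MiniMax-M2.5 and MiniMax-M2.5-highspeed follow the same pattern.
}

# "204K context window" per the PR description (exact value assumed).
MINIMAX_TOKEN_LIMITS = {name: 204_000 for name in MINIMAX_MODEL_INFO}
```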

Co-Authored-By: Octopus <liyuan851277048@icloud.com>
@octo-patch octo-patch changed the title Add MiniMax model support via OpenAI-compatible API Add MiniMax model support (M2.5 + M2.7) via OpenAI-compatible API Mar 18, 2026