feat: add MiniMax provider support #6

Merged
macOS26 merged 1 commit into macOS26:main from octo-patch:feature/add-minimax-provider
Apr 17, 2026

Conversation

@octo-patch

Summary

  • Adds MiniMax as an OpenAI-compatible LLM provider (chat models only)
  • Default model: MiniMax-M2.7, also supports MiniMax-M2.7-highspeed
  • API endpoint: https://api.minimax.io/v1 (OpenAI-compatible protocol)
  • Temperature default: 1.0 (MiniMax requires > 0.0)
  • Context window: 1,000,000 tokens
  • API key stored securely in Keychain under com.agent.minimax-api-key

Note: This PR depends on macOS26/AgentTools#1 which adds the APIProvider.miniMax enum case to the shared package.
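The summary above can be sketched as a provider configuration in Swift. This is a minimal illustration, not the actual shape of `LLMProviderSetup.swift` (the `ProviderConfig` type and its field names are hypothetical); only the values come from this PR:

```swift
import Foundation

// Hypothetical config type; the real LLMProviderSetup.swift may differ.
struct ProviderConfig {
    let id: String
    let baseURL: URL
    let defaultModel: String
    let models: [String]
    let defaultTemperature: Double
    let contextWindow: Int
}

// MiniMax settings as described in the PR summary.
let miniMax = ProviderConfig(
    id: "minimax",
    baseURL: URL(string: "https://api.minimax.io/v1")!,
    defaultModel: "MiniMax-M2.7",
    models: ["MiniMax-M2.7", "MiniMax-M2.7-highspeed"],
    defaultTemperature: 1.0,   // MiniMax requires temperature > 0.0
    contextWindow: 1_000_000
)
```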

Files changed

File: Change
Agent/Services/LLMProviderSetup.swift: Register MiniMax provider config
Agent/Services/KeychainService.swift: Add Keychain get/set for MiniMax API key
Agent/AgentViewModel/Core/AgentViewModel.swift: Add MiniMax key, model, models, temperature properties
Agent/AgentViewModel/Features/DefaultModels.swift: Add default MiniMax model list
Agent/AgentViewModel/Features/ModelFetching.swift: Add fetchMiniMaxModels() + fetchModelsIfNeeded case
Agent/AgentViewModel/Features/ScriptTabs.swift: Wire up globalModelForProvider, apiKeyForProvider, modelDisplayName
Agent/AgentViewModel/Core/Colors.swift: Add temperature binding
Agent/AgentViewModel/TaskExecution/Setup.swift: Add provider/model/vision resolution
Agent/Views/Settings/SettingsView.swift: Add MiniMax settings section (API key + model picker)
Agent/Views/Tabs/NewMainTabSheet.swift: Add MiniMax to new tab model picker
Agent/Views/Settings/FallbackChainView.swift: Add MiniMax to fallback chain default model
Agent/Views/Output/ThinkingIndicatorView.swift: Add 1M context window for MiniMax
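The Keychain storage mentioned for `KeychainService.swift` can be sketched with the Security framework. The actual `KeychainService` API is not shown in this diff, so the type and method names below are hypothetical; only the account string `com.agent.minimax-api-key` comes from the PR:

```swift
import Foundation
import Security

// Hypothetical sketch of the Keychain storage described in the PR;
// the real KeychainService.swift API is not shown in the diff.
enum MiniMaxKeychain {
    static let account = "com.agent.minimax-api-key"

    static func save(apiKey: String) -> Bool {
        let query: [String: Any] = [
            kSecClass as String: kSecClassGenericPassword,
            kSecAttrAccount as String: account,
        ]
        // Replace any existing item, then add the new one.
        SecItemDelete(query as CFDictionary)
        var attributes = query
        attributes[kSecValueData as String] = Data(apiKey.utf8)
        return SecItemAdd(attributes as CFDictionary, nil) == errSecSuccess
    }

    static func load() -> String? {
        let query: [String: Any] = [
            kSecClass as String: kSecClassGenericPassword,
            kSecAttrAccount as String: account,
            kSecReturnData as String: true,
            kSecMatchLimit as String: kSecMatchLimitOne,
        ]
        var item: CFTypeRef?
        guard SecItemCopyMatching(query as CFDictionary, &item) == errSecSuccess,
              let data = item as? Data else { return nil }
        return String(data: data, encoding: .utf8)
    }
}
```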

Test plan

  • Build succeeds with no Swift compiler errors (all exhaustive switch statements updated)
  • MiniMax section appears in Settings under LLM Providers
  • API key saves/loads correctly from Keychain
  • Model picker shows MiniMax-M2.7 and MiniMax-M2.7-highspeed
  • MiniMax appears as an option in New LLM Tab sheet
  • MiniMax appears in Fallback Chain provider picker
  • Sending a chat message to MiniMax returns a response
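The first test-plan item leans on Swift's exhaustive-switch checking: adding the `APIProvider.miniMax` case forces every `switch` over the enum to handle it or fail to compile. A minimal sketch of one such site (the enum's other cases and their values are illustrative; `APIProvider` actually lives in the AgentTools package per macOS26/AgentTools#1):

```swift
// Illustrative stand-in for the shared APIProvider enum.
enum APIProvider {
    case openAI, anthropic, miniMax
}

func contextWindow(for provider: APIProvider) -> Int {
    // Exhaustive switch: adding .miniMax without handling it here
    // would be a compile error, which is what the test plan relies on.
    switch provider {
    case .openAI:    return 128_000    // illustrative value
    case .anthropic: return 200_000    // illustrative value
    case .miniMax:   return 1_000_000  // per the PR summary
    }
}
```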

Add MiniMax as an OpenAI-compatible LLM provider:
- Models: MiniMax-M2.7 (default) and MiniMax-M2.7-highspeed
- API: https://api.minimax.io/v1 (OpenAI-compatible)
- Temperature default: 1.0 (MiniMax requires > 0.0)
- Context window: 1,000,000 tokens
- Keychain storage for MINIMAX_API_KEY
- Full UI: API key field, model picker with refresh, settings section

Depends on macOS26/AgentTools#1 for the APIProvider.miniMax enum case.
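Since the endpoint speaks the OpenAI-compatible chat protocol, a request can be sketched as below. The app's actual networking layer is not shown in this PR; the `/chat/completions` path and payload field names follow the standard OpenAI-compatible format, and the placeholder API key would come from the Keychain in the real app:

```swift
import Foundation

// Sketch of an OpenAI-compatible chat request to MiniMax.
struct ChatMessage: Codable {
    let role: String
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
    let temperature: Double
}

let body = ChatRequest(
    model: "MiniMax-M2.7",
    messages: [ChatMessage(role: "user", content: "Hello")],
    temperature: 1.0  // MiniMax requires temperature > 0.0
)

var request = URLRequest(url: URL(string: "https://api.minimax.io/v1/chat/completions")!)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
// In the real app the key is loaded from the Keychain, not hardcoded.
request.setValue("Bearer <MINIMAX_API_KEY>", forHTTPHeaderField: "Authorization")
request.httpBody = try! JSONEncoder().encode(body)
```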
macOS26 merged commit 212b13a into macOS26:main on Apr 17, 2026