feat: Add Gemini 3 Pro support with thinking_level parameter #47

Kamilbenkirane merged 1 commit into main
Conversation
- Add gemini-3-pro-preview model with 65,536 output token limit
- Add thinking_level parameter (low/high) for Gemini 3
- Update all Gemini 2.5 models to correct 65,536 output token limits
- Add ThinkingLevelMapper for Google provider
- Bump version to 0.2.9
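The thinking_level plumbing described above might look roughly like the following. This is an illustrative Python sketch, not the PR's actual code: the class name ThinkingLevelMapper and the 65,536-token limits come from the commit message, but the method name, the request-dict shape, and the exact model list are assumptions.

```python
# Illustrative sketch only: validate the user-facing thinking_level ("low"/"high")
# and translate it into a provider-style request config dict.
VALID_THINKING_LEVELS = ("low", "high")

# Output-token limits named in the commit message (gemini-3-pro-preview and the
# Gemini 2.5 family all at 65,536). The model list here is illustrative.
MAX_OUTPUT_TOKENS = {
    "gemini-3-pro-preview": 65_536,
    "gemini-2.5-pro": 65_536,
    "gemini-2.5-flash": 65_536,
}

class ThinkingLevelMapper:
    """Validate thinking_level and map it to a Google-style config fragment."""

    def map(self, level: str) -> dict:
        if level not in VALID_THINKING_LEVELS:
            raise ValueError(
                f"thinking_level must be one of {VALID_THINKING_LEVELS}, got {level!r}"
            )
        return {"thinking_config": {"thinking_level": level}}

mapper = ThinkingLevelMapper()
print(mapper.map("high"))
print(MAX_OUTPUT_TOKENS["gemini-3-pro-preview"])
```

Keeping the mapper separate from the model table mirrors the commit's split between the new parameter and the per-model token limits, so each can be tested on its own.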
Pull Request Review: Gemini 3 Pro Support

Summary

This PR adds support for Google's Gemini 3 Pro model with the thinking_level parameter.

✅ Strengths

Code Quality
Architecture
🔍 Observations & Suggestions

1. Missing Test Coverage
Changes
- gemini-3-pro-preview model with 65,536 output token limit
- thinking_level parameter (low/high) for Gemini 3 Pro
- ThinkingLevelMapper for Google provider

Testing
All pre-commit hooks passed.
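The review's "Missing Test Coverage" observation could be addressed with tests along these lines. This is a hedged sketch: validate_thinking_level is a stand-in name for whatever validation the PR actually performs, not a function from the codebase.

```python
# Hypothetical unit tests for the thinking_level validation the PR adds.
# validate_thinking_level is a stand-in; the real logic lives in the Google
# provider's mapper.
VALID_THINKING_LEVELS = ("low", "high")

def validate_thinking_level(level: str) -> str:
    """Stand-in validator: Gemini 3 accepts only 'low' or 'high' (per the PR)."""
    if level not in VALID_THINKING_LEVELS:
        raise ValueError(f"invalid thinking_level: {level!r}")
    return level

def test_accepts_documented_levels():
    assert validate_thinking_level("low") == "low"
    assert validate_thinking_level("high") == "high"

def test_rejects_unknown_level():
    try:
        validate_thinking_level("medium")
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for unsupported level")
```

Written with plain asserts so the functions run both under pytest discovery and as a standalone script.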
References