Update for Gemini 3 #203
Walkthrough

Updates the Gemini plugin to support Gemini 3 models with version bumps, new configuration options (`thinking_level`, `media_resolution`), and a `DEFAULT_MODEL` constant. Documentation includes migration guidance, and example code adopts default LLM initialization.
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
Pre-merge checks and finishing touches

❌ Failed checks (1 inconclusive)
✅ Passed checks (2 passed)
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
examples/01_simple_agent_example/simple_agent_example.py (1)
21-21: Update outdated model reference in comment.

The comment references "gemini-2.5-flash-lite", but the code now uses the default model "gemini-3-pro-preview".
Apply this diff:

```diff
-- gemini-2.5-flash-lite for fast responses
+- gemini-3-pro-preview for fast responses
```
🧹 Nitpick comments (2)
plugins/gemini/vision_agents/plugins/gemini/gemini_llm.py (2)
86-92: Clarify config and kwargs precedence.

If both `config` and `**kwargs` are provided, `kwargs` are silently ignored. This could confuse users who expect kwargs to extend or override the config. Consider either:

- Documenting this precedence clearly in the docstring
- Raising an error if both are provided
- Merging kwargs into the provided config

Example docstring addition:

```diff
 config: Optional[GenerateContentConfig] to use as base. Any kwargs will be passed to GenerateContentConfig constructor if config is not provided.
+    Note: If config is provided, kwargs are ignored.
 **kwargs: Additional arguments passed to GenerateContentConfig constructor.
+    Only used if config is not provided.
```
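One way the "merge kwargs into the provided config" option could look is sketched below with a stand-in dataclass; the real `GenerateContentConfig` is a Pydantic model, where `model_copy(update=...)` would play the role that `dataclasses.replace` plays here (`FakeConfig` and `resolve_config` are illustrative names, not the plugin's API):

```python
from dataclasses import dataclass, replace
from typing import Optional

@dataclass
class FakeConfig:
    # Stand-in for GenerateContentConfig (assumption for illustration).
    temperature: Optional[float] = None
    top_p: Optional[float] = None

def resolve_config(config: Optional[FakeConfig] = None, **kwargs) -> FakeConfig:
    """Merge kwargs into a provided config instead of silently ignoring them."""
    if config is None:
        return FakeConfig(**kwargs)
    # kwargs override fields on a copy, leaving the caller's object untouched
    return replace(config, **kwargs)

base = FakeConfig(temperature=0.5)
merged = resolve_config(base, top_p=0.9)
print(merged.temperature, merged.top_p)  # 0.5 0.9
print(base.top_p)  # None — the original config is not mutated
```

This keeps both inputs usable without raising, at the cost of a documented override rule.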
169-187: Simplify config handling logic for better maintainability.

The config handling has multiple conditional branches that are difficult to follow. The logic is functionally correct but could be clearer.
Consider consolidating the logic:
```diff
 # Add tools if available - Gemini uses GenerateContentConfig
 tools_spec = self.get_available_functions()
+
+# Get or build base config
+cfg = kwargs.get("config")
+if not isinstance(cfg, GenerateContentConfig):
+    cfg = self._build_config()
+else:
+    cfg = self._build_config(base_config=cfg)
+
+# Add tools if available
 if tools_spec:
     conv_tools = self._convert_tools_to_provider_format(tools_spec)
-    cfg = kwargs.get("config")
-    if not isinstance(cfg, GenerateContentConfig):
-        cfg = self._build_config()
-    else:
-        cfg = self._build_config(base_config=cfg)
     cfg.tools = conv_tools  # type: ignore[assignment]
-    kwargs["config"] = cfg
-else:
-    cfg = kwargs.get("config")
-    if cfg is None or not isinstance(cfg, GenerateContentConfig):
-        cfg = self._build_config()
-        kwargs["config"] = cfg
-    elif self.thinking_level or self.media_resolution:
-        cfg = self._build_config(base_config=cfg)
-        kwargs["config"] = cfg
+
+kwargs["config"] = cfg
```
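The consolidated "normalize once, attach tools, write back once" flow can be exercised in isolation with a stub config class (all names here are illustrative, not the plugin's actual API):

```python
class StubConfig:
    # Stand-in for GenerateContentConfig, used only to illustrate the flow.
    def __init__(self):
        self.tools = None

def prepare_request(kwargs, build_config, convert_tools, tools_spec):
    """Normalize the config first, attach tools if any, then write it back once."""
    cfg = kwargs.get("config")
    if not isinstance(cfg, StubConfig):
        cfg = build_config()  # no usable config supplied: build a fresh one
    if tools_spec:
        cfg.tools = convert_tools(tools_spec)  # attach converted tools
    kwargs["config"] = cfg
    return kwargs

out = prepare_request({}, StubConfig, list, ["toolA"])
print(out["config"].tools)  # ['toolA']
```

With a single exit point for `kwargs["config"]`, every request sees a normalized config regardless of whether tools are present.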
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Disabled knowledge base sources:
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (5)
- examples/01_simple_agent_example/simple_agent_example.py (1 hunks)
- plugins/gemini/README.md (2 hunks)
- plugins/gemini/pyproject.toml (1 hunks)
- plugins/gemini/vision_agents/plugins/gemini/__init__.py (1 hunks)
- plugins/gemini/vision_agents/plugins/gemini/gemini_llm.py (4 hunks)
🧰 Additional context used
🧬 Code graph analysis (2)
examples/01_simple_agent_example/simple_agent_example.py (2)

- plugins/gemini/tests/test_gemini_llm.py (1): `llm` (31-34)
- agents-core/vision_agents/core/llm/llm.py (1): `LLM` (49-418)

plugins/gemini/vision_agents/plugins/gemini/gemini_llm.py (2)

- agents-core/vision_agents/core/llm/llm.py (2): `get_available_functions` (227-229), `_convert_tools_to_provider_format` (122-136)
- plugins/gemini/vision_agents/plugins/gemini/gemini_realtime.py (1): `_convert_tools_to_provider_format` (448-470)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
- GitHub Check: unit / Test "not integration"
- GitHub Check: unit / Ruff
- GitHub Check: unit / Mypy
- GitHub Check: unit / Test "not integration"
- GitHub Check: unit / Ruff
- GitHub Check: unit / Mypy
🔇 Additional comments (9)
examples/01_simple_agent_example/simple_agent_example.py (1)

29-29: LGTM! Simplified initialization using defaults.

The change to use `gemini.LLM()` without explicit model specification leverages the new `DEFAULT_MODEL` constant, making the example cleaner.

plugins/gemini/vision_agents/plugins/gemini/__init__.py (1)

3-5: LGTM! API expansion for Gemini 3 configuration.

Exposing `ThinkingLevel` and `MediaResolution` enables users to configure these options when initializing `GeminiLLM`, as documented in the migration guide.

plugins/gemini/README.md (2)

8-8: LGTM! Updated installation instruction.

The installation command now uses the Vision Agents extras syntax, which is clearer and more maintainable.

113-120: LGTM! Helpful migration guidance for Gemini 3.

The migration section provides clear, actionable advice for users upgrading from Gemini 2.5, covering thinking level, temperature, PDF handling, and token consumption considerations.
plugins/gemini/vision_agents/plugins/gemini/gemini_llm.py (4)
216-275: LGTM! Robust multi-hop tool calling with safeguards.

The implementation includes important safeguards:
- MAX_ROUNDS limit prevents infinite loops
- Deduplication prevents redundant tool executions
- Output sanitization prevents oversized responses
- Proper error handling in the execution flow
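The safeguards listed above can be sketched as a bounded, deduplicated tool loop. This is an illustration of the pattern the review describes, not the plugin's actual code; `MAX_ROUNDS`, `MAX_OUTPUT_CHARS`, and the callback names are assumptions:

```python
import json

MAX_ROUNDS = 5          # assumption: mirrors the review's "MAX_ROUNDS limit"
MAX_OUTPUT_CHARS = 4096  # assumption: illustrative output-sanitization bound

def run_tool_loop(next_calls, execute):
    """Bounded, deduplicated multi-hop tool loop (sketch).

    next_calls(round) -> list of (name, args) pairs, or [] when the model is done.
    execute(name, args) -> tool output, stringified and truncated before use.
    """
    seen = set()
    transcript = []
    for round_no in range(MAX_ROUNDS):  # hard cap prevents infinite loops
        calls = next_calls(round_no)
        if not calls:
            break
        for name, args in calls:
            key = name + ":" + json.dumps(args, sort_keys=True)
            if key in seen:
                continue  # dedup: skip a call we already executed
            seen.add(key)
            transcript.append(str(execute(name, args))[:MAX_OUTPUT_CHARS])
    return transcript

# usage: the model asks for the same call twice; the repeat is skipped
calls_by_round = [[("add", {"a": 1, "b": 2})], [("add", {"a": 1, "b": 2})], []]
out = run_tool_loop(lambda r: calls_by_round[r], lambda n, a: a["a"] + a["b"])
print(out)  # ['3']
```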
341-362: LGTM! Correct tool conversion to Gemini format.

The implementation properly converts ToolSchema objects to Gemini's function_declarations format, consistent with the pattern used in gemini_realtime.py.
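The function_declarations shape referred to here can be sketched for a generic tool schema. The dict field names follow the google-genai function-calling format; `to_gemini_tools` is an illustrative helper, not the plugin's method:

```python
def to_gemini_tools(tool_schemas):
    """Wrap generic tool schemas into Gemini's function_declarations shape (sketch)."""
    return [{
        "function_declarations": [
            {
                "name": t["name"],
                "description": t.get("description", ""),
                # OpenAPI-style parameter schema, as the Gemini API expects
                "parameters": t.get("parameters", {"type": "object", "properties": {}}),
            }
            for t in tool_schemas
        ]
    }]

tools = to_gemini_tools([{
    "name": "get_weather",
    "description": "Weather lookup",
    "parameters": {"type": "object", "properties": {"city": {"type": "string"}}},
}])
print(tools[0]["function_declarations"][0]["name"])  # get_weather
```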
364-428: LGTM! Defensive tool call extraction with fallbacks.

The implementation includes appropriate fallbacks and error handling:

- Tries top-level `function_calls` first
- Falls back to `candidates.content.parts` structure
- Silently ignores extraction errors to prevent disruption
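That fallback order can be mimicked with simple stub objects. This is a sketch of the described behavior, not the plugin's actual implementation:

```python
from types import SimpleNamespace

def extract_tool_calls(response):
    """Try top-level function_calls, then candidates.content.parts,
    swallowing extraction errors as the review describes."""
    calls = getattr(response, "function_calls", None)
    if calls:
        return list(calls)
    try:
        found = []
        for cand in getattr(response, "candidates", None) or []:
            for part in cand.content.parts:
                fc = getattr(part, "function_call", None)
                if fc is not None:
                    found.append(fc)
        return found
    except Exception:
        return []  # silently ignore malformed responses

# usage: a response where only the nested structure is populated
part = SimpleNamespace(function_call={"name": "get_weather"})
resp = SimpleNamespace(
    function_calls=None,
    candidates=[SimpleNamespace(content=SimpleNamespace(parts=[part]))],
)
print(extract_tool_calls(resp))  # [{'name': 'get_weather'}]
```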
430-461: LGTM! Robust tool result handling with fallback.

The method properly handles various result formats and includes a fallback mechanism to create text parts if serialization fails.
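A minimal version of that fallback pattern (illustrative `to_result_part`, not the plugin's method — the part shapes are assumptions for the sketch):

```python
import json

def to_result_part(result):
    """Serialize a tool result, falling back to a plain text part
    when the value is not JSON-serializable (sketch of the pattern)."""
    try:
        # round-trip through JSON to guarantee a serializable payload
        return {"function_response": {"response": json.loads(json.dumps(result))}}
    except (TypeError, ValueError):
        return {"text": str(result)}  # fallback: text part

print(to_result_part({"temp": 21}))  # {'function_response': {'response': {'temp': 21}}}
print(to_result_part({1, 2}))        # sets are not JSON-serializable -> text fallback
```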
plugins/gemini/pyproject.toml (1)
15-15: google-genai version 1.51.0 verified as available and secure.

Version 1.51.0 exists on PyPI and has no known security advisories in the GitHub vulnerability database, confirming the dependency specification is valid and poses no immediate security concerns.
```python
def _build_config(
    self,
    system_instruction: Optional[str] = None,
    base_config: Optional[GenerateContentConfig] = None,
) -> GenerateContentConfig:
    """
    Build GenerateContentConfig with Gemini 3 features.

    Args:
        system_instruction: Optional system instruction to include
        base_config: Optional base config to extend (takes precedence over self._base_config)

    Returns:
        GenerateContentConfig with thinking_level and media_resolution if set
    """
    if base_config is not None:
        config = base_config
    elif self._base_config is not None:
        config = self._base_config
    else:
        config = GenerateContentConfig()

    if system_instruction:
        config.system_instruction = system_instruction

    if self.thinking_level:
        from google.genai.types import ThinkingConfig

        config.thinking_config = ThinkingConfig(thinking_level=self.thinking_level)

    if self.media_resolution:
        config.media_resolution = self.media_resolution

    return config
```
🧩 Analysis chain
Config mutation risk: shared config objects may be unexpectedly modified.
The method mutates the base_config or self._base_config object in place. If a user passes a GenerateContentConfig to the constructor and calls send_message multiple times, the same config object is mutated repeatedly. This could lead to unexpected behavior if the config is shared or reused elsewhere.
Consider creating a copy of the config before mutating:
```diff
 def _build_config(
     self,
     system_instruction: Optional[str] = None,
     base_config: Optional[GenerateContentConfig] = None,
 ) -> GenerateContentConfig:
     """
     Build GenerateContentConfig with Gemini 3 features.
     Args:
         system_instruction: Optional system instruction to include
         base_config: Optional base config to extend (takes precedence over self._base_config)
     Returns:
         GenerateContentConfig with thinking_level and media_resolution if set
     """
     if base_config is not None:
-        config = base_config
+        # Create a copy to avoid mutating the caller's config
+        config = GenerateContentConfig(**base_config.__dict__)
     elif self._base_config is not None:
-        config = self._base_config
+        # Create a copy to avoid mutating stored config
+        config = GenerateContentConfig(**self._base_config.__dict__)
     else:
         config = GenerateContentConfig()
```

Note: Verify the appropriate copy mechanism for GenerateContentConfig - it may have a copy() method or require different handling.
🏁 Script executed:

```shell
#!/bin/bash
# Check if GenerateContentConfig has a copy method or how to properly copy it
ast-grep --pattern $'class GenerateContentConfig {
  $$$
}'
```

Length of output: 0
🏁 Script executed:

```shell
#!/bin/bash
# Find how GenerateContentConfig is imported and used throughout the codebase
rg -n 'GenerateContentConfig' --type=py -B2 -A2
```

Length of output: 5758
🏁 Script executed:

```shell
#!/bin/bash
# Check if there are any copy, clone, or deepcopy patterns used in the codebase
rg -n 'from copy import|import copy|\.copy\(\)|deepcopy' --type=py | head -20
```

Length of output: 898
🏁 Script executed:

```shell
#!/bin/bash
# Look for the __init__ method of the GeminiLLM class to see how base_config is stored
rg -n -A 15 'def __init__' plugins/gemini/vision_agents/plugins/gemini/gemini_llm.py
```

Length of output: 624
🌐 Web query:
google.genai GenerateContentConfig copy clone method
💡 Result:
Use the Pydantic copy API — GenerateContentConfig is a Pydantic model so call model_copy (or model_copy(deep=True) for a deep clone). Example (Python):
```python
from google.genai import types

orig = types.GenerateContentConfig(temperature=0.5, candidate_count=2)
shallow = orig.model_copy()
deep = orig.model_copy(deep=True)
```

or with updates:

```python
updated = orig.model_copy(update={"temperature": 0.1})
```

Sources: python-genai types docs and Pydantic model_copy documentation. [1][2]
Config mutation risk: Use model_copy() to avoid mutating shared config objects.
The method mutates base_config or self._base_config directly. If the same config object is passed or stored and _build_config() is called multiple times, the mutations accumulate on the same object. Related code in gemini_realtime.py already demonstrates copying patterns (lines 140, 316).
Since GenerateContentConfig is a Pydantic model, use model_copy() to create copies:
```diff
 def _build_config(
     self,
     system_instruction: Optional[str] = None,
     base_config: Optional[GenerateContentConfig] = None,
 ) -> GenerateContentConfig:
     """
     Build GenerateContentConfig with Gemini 3 features.
     Args:
         system_instruction: Optional system instruction to include
         base_config: Optional base config to extend (takes precedence over self._base_config)
     Returns:
         GenerateContentConfig with thinking_level and media_resolution if set
     """
     if base_config is not None:
-        config = base_config
+        config = base_config.model_copy()
     elif self._base_config is not None:
-        config = self._base_config
+        config = self._base_config.model_copy()
     else:
         config = GenerateContentConfig()
```
config = GenerateContentConfig()📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
def _build_config(
    self,
    system_instruction: Optional[str] = None,
    base_config: Optional[GenerateContentConfig] = None,
) -> GenerateContentConfig:
    """
    Build GenerateContentConfig with Gemini 3 features.

    Args:
        system_instruction: Optional system instruction to include
        base_config: Optional base config to extend (takes precedence over self._base_config)

    Returns:
        GenerateContentConfig with thinking_level and media_resolution if set
    """
    if base_config is not None:
        config = base_config.model_copy()
    elif self._base_config is not None:
        config = self._base_config.model_copy()
    else:
        config = GenerateContentConfig()

    if system_instruction:
        config.system_instruction = system_instruction

    if self.thinking_level:
        from google.genai.types import ThinkingConfig

        config.thinking_config = ThinkingConfig(thinking_level=self.thinking_level)

    if self.media_resolution:
        config.media_resolution = self.media_resolution

    return config
```
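The mutation hazard this suggestion fixes can be reproduced with plain Python objects. The `Cfg` class below is an illustrative stand-in; for the real Pydantic `GenerateContentConfig`, `model_copy()` plays the role of `copy.deepcopy` here:

```python
import copy
from dataclasses import dataclass
from typing import Optional

@dataclass
class Cfg:
    # Illustrative stand-in for GenerateContentConfig.
    system_instruction: Optional[str] = None

def build_mutating(base: Cfg, instruction: str) -> Cfg:
    base.system_instruction = instruction  # mutates the caller's object in place
    return base

def build_copying(base: Cfg, instruction: str) -> Cfg:
    cfg = copy.deepcopy(base)  # work on a copy instead
    cfg.system_instruction = instruction
    return cfg

shared = Cfg()
build_mutating(shared, "call 1")
print(shared.system_instruction)  # 'call 1' — the shared config was changed

shared = Cfg()
build_copying(shared, "call 1")
print(shared.system_instruction)  # None — the original is untouched
```

With the mutating variant, repeated `send_message` calls would keep accumulating changes on the one stored config object; copying isolates each call.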