Add Google Generative AI provider support #1724
Conversation
Add Google Gemini (`google_generative_ai`) as a selectable LLM provider so users can list and connect to Google models. Implement `listGoogleModels` to fetch and parse Google model metadata, register the provider in the shared LLM provider list, wire it into the model selection logic, and create provider connections using `@ai-sdk/google` in the LLM connection hook. This change enables discovery of Google models, handles generation capability checks and ignore rules, and integrates the provider into the existing UI and connection flow.
✅ Deploy Preview for hyprnote ready!
Caution: Review failed. The pull request is closed.

📝 Walkthrough

Adds Google Generative AI (Google Gemini) as an LLM provider: provider config, model discovery module, common filter updates, wiring into provider selection and language-model connection, and provider-specific UI guidance.
Sequence Diagram(s)

sequenceDiagram
participant UI as Settings UI
participant Select as select.tsx
participant ListGoogle as list-google.ts
participant GoogleAPI as Google API
participant Hook as useLLMConnection
UI->>Select: choose provider google_generative_ai
Select->>ListGoogle: listGoogleModels(baseUrl, apiKey)
ListGoogle->>GoogleAPI: GET /models (x-goog-api-key)
GoogleAPI-->>ListGoogle: models + metadata
ListGoogle->>ListGoogle: validate schema, strip "models/" prefix, partition/filter
ListGoogle-->>Select: included / ignored model IDs
Select-->>UI: render available models
UI->>Hook: initialize provider google_generative_ai
Hook->>Hook: createGoogleGenerativeAI(fetch, baseURL, apiKey)
Hook-->>UI: provider instance (ready)
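
For reference, the slice of the ListModels response this flow consumes looks roughly like the types below; this is a sketch based on the public Generative Language API field names (`name`, `displayName`, `supportedGenerationMethods`) and may not match the PR's `GoogleModelSchema` exactly.

```ts
// Rough shape of the GET {baseUrl}/models payload consumed by the flow above.
// Field selection is an assumption; the PR's GoogleModelSchema may differ.
interface GoogleModel {
  name: string; // e.g. "models/gemini-2.0-flash"; the "models/" prefix gets stripped
  displayName?: string;
  description?: string;
  supportedGenerationMethods?: string[]; // must include "generateContent" to be selectable
}

interface GoogleListModelsResponse {
  models: GoogleModel[];
  nextPageToken?: string; // present when more pages are available
}
```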
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
Possibly related PRs
Pre-merge checks and finishing touches

❌ Failed checks (1 warning, 1 inconclusive)
✅ Passed checks (1 passed)
Actionable comments posted: 0
🧹 Nitpick comments (2)
apps/desktop/src/components/settings/ai/shared/list-common.ts (1)
21-31: Keyword expansion affects all providers' model filtering

Adding `"computer"` and `"robotics"` to `commonIgnoreKeywords` will now ignore any model id containing these substrings for every provider (not just Google) via `shouldIgnoreCommonKeywords`. That's probably what you want for `*-computer-use*` / robotics-style tool models, but if you hit false positives later, consider narrowing this (e.g. provider-specific checks or more specific patterns).
apps/desktop/src/components/settings/ai/shared/list-google.ts (1)

1-63: Google model listing implementation is sound; a couple of small nits

This module cleanly follows the existing pattern: typed decode via `GoogleModelSchema`, shared `partition` / `shouldIgnoreCommonKeywords`, and a defensive Effect pipeline with timeout and a `DEFAULT_RESULT` fallback (a rough sketch of the flow follows after the nits below). The `supportsGeneration` / `getIgnoreReasons` logic should correctly filter out non-`generateContent` and embedding/robotics models.

Two small, non-blocking nits:

- The `async` on `listGoogleModels` is redundant since you just return `Effect.runPromise(...)`; you could drop `async` or add an `await` for clarity.
- If you ever need more than the default page size from `/models`, you may want to add pagination or a `pageSize` query param, but that's probably overkill for a simple model picker.
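
To make this concrete, here is roughly what such a listing function does, written with a plain `fetch` and without the Effect pipeline, schema validation, or the common keyword filter; the function name and result shape are assumptions, not the actual `list-google.ts`.

```ts
// Minimal sketch, not the PR's implementation: plain fetch instead of the Effect
// pipeline, and the result shape ({ models, ignored }) is an assumption.
export async function listGoogleModelsSketch(
  baseUrl: string,
  apiKey: string,
): Promise<{ models: string[]; ignored: string[] }> {
  const res = await fetch(`${baseUrl}/models`, {
    headers: { "x-goog-api-key": apiKey },
  });
  if (!res.ok) {
    // Mirrors the DEFAULT_RESULT fallback idea: fail closed with empty lists.
    return { models: [], ignored: [] };
  }

  const body = (await res.json()) as {
    models?: Array<{ name: string; supportedGenerationMethods?: string[] }>;
  };

  const models: string[] = [];
  const ignored: string[] = [];
  for (const model of body.models ?? []) {
    const id = model.name.replace(/^models\//, ""); // strip the "models/" prefix
    const canGenerate =
      model.supportedGenerationMethods?.includes("generateContent") ?? false;
    (canGenerate ? models : ignored).push(id);
  }
  return { models, ignored };
}
```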
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
- apps/desktop/src/components/settings/ai/llm/select.tsx (2 hunks)
- apps/desktop/src/components/settings/ai/llm/shared.tsx (1 hunks)
- apps/desktop/src/components/settings/ai/shared/list-common.ts (1 hunks)
- apps/desktop/src/components/settings/ai/shared/list-google.ts (1 hunks)
- apps/desktop/src/hooks/useLLMConnection.ts (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (2)
apps/desktop/src/components/settings/ai/shared/list-google.ts (1)
apps/desktop/src/components/settings/ai/shared/list-common.ts (7)
`ListModelsResult` (13-16), `DEFAULT_RESULT` (18-18), `ModelIgnoreReason` (3-9), `shouldIgnoreCommonKeywords` (45-48), `fetchJson` (33-43), `partition` (50-66), `REQUEST_TIMEOUT` (19-19)
apps/desktop/src/components/settings/ai/llm/select.tsx (1)
apps/desktop/src/components/settings/ai/shared/list-google.ts (1)
`listGoogleModels` (26-63)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: ci (macos, macos-14)
- GitHub Check: fmt
🔇 Additional comments (3)
apps/desktop/src/components/settings/ai/llm/select.tsx (1)
18-18: Google provider is correctly wired into the model listing flow

The `listGoogleModels` import and the `"google_generative_ai"` switch case match the patterns used for the other providers and correctly pass `(baseUrl, apiKey)`. This should integrate cleanly with `ModelCombobox` and the existing `useConfiguredMapping` behavior.

Also applies to: 237-239
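
The wiring amounts to one more branch in the provider-to-lister dispatch. A hedged sketch, where the surrounding function name and the non-Google branches are assumptions rather than the actual `select.tsx` code:

```ts
import { listGoogleModels } from "../shared/list-google";

// Hypothetical dispatcher illustrating the switch case the comment refers to.
async function listModelsForProvider(
  provider: string,
  baseUrl: string,
  apiKey: string,
) {
  switch (provider) {
    case "google_generative_ai":
      // New branch from this PR: delegate to the shared Google lister.
      return listGoogleModels(baseUrl, apiKey);
    // ...other providers (OpenAI, Anthropic, etc.) follow the same pattern
    default:
      return { models: [], ignored: [] }; // assumed empty-result shape
  }
}
```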
apps/desktop/src/hooks/useLLMConnection.ts (1)
2-2: Google Generative AI branch mirrors other providers

The `google_generative_ai` path is consistent with Anthropic/OpenAI: you construct the provider via `createGoogleGenerativeAI({ fetch, baseURL, apiKey })` and then call `googleProvider(conn.modelId)` before wrapping with the thinking middleware, which keeps behavior aligned with the other backends. Please just double-check this matches the expected factory signature and call style for your `@ai-sdk/google` version.

Also applies to: 80-88
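
For reference, the `@ai-sdk/google` factory usage described above looks roughly like this; `conn` is a hypothetical connection object, and the thinking-middleware wrapping is left as a comment since its exact form in `useLLMConnection.ts` isn't shown here.

```ts
import { createGoogleGenerativeAI } from "@ai-sdk/google";

// Sketch of the provider construction; conn is an assumed shape, not the hook's state.
function buildGoogleLanguageModel(conn: {
  baseUrl: string;
  apiKey: string;
  modelId: string;
}) {
  const googleProvider = createGoogleGenerativeAI({
    baseURL: conn.baseUrl, // e.g. https://generativelanguage.googleapis.com/v1beta
    apiKey: conn.apiKey,
    fetch, // the hook reportedly injects its own fetch here
  });

  // The hook then wraps the returned model with its thinking middleware.
  return googleProvider(conn.modelId);
}
```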
apps/desktop/src/components/settings/ai/llm/shared.tsx (1)
60-68: Google Gemini provider entry looks consistent with the rest of the config

The new `"google_generative_ai"` provider definition (id, display name, icon, `apiKey: true`, and `requiresPro: false`) lines up with how it's used in `useLLMConnection` and the model selector. The default base URL points at the `v1beta` Generative Language API; confirm that this matches the version your `@ai-sdk/google` setup expects for both listing (`/models`) and inference.