
Conversation

@yujonglee
Contributor

No description provided.

@coderabbitai

coderabbitai bot commented Oct 21, 2025

Caution

Review failed

The pull request is closed.

📝 Walkthrough

Refactors provider definitions from an object-based baseUrl ({ value, immutable }) to plain strings, introduces a generic "custom" provider, and converts ProviderCard and related UI to drive behavior from a single config prop; adjusts form rendering, validation, and provider-context messages accordingly.
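For readers skimming the diff, here is a rough before/after of the provider-definition shape the walkthrough describes (field names are approximations, not the exact types in shared.tsx):

```ts
// Before: baseUrl carried an immutability flag alongside its value.
type ProviderDefinitionBefore = {
  id: string;
  displayName: string;
  baseUrl: { value: string; immutable: boolean };
  apiKey: boolean;
};

// After: baseUrl is a plain string, or undefined for the generic "custom"
// provider, which asks the user to supply an endpoint instead.
type ProviderDefinitionAfter = {
  id: string;
  displayName: string;
  baseUrl?: string;
  apiKey: boolean;
};
```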

Changes

LLM Provider Configuration
  • Files: apps/desktop/src/components/settings/ai/llm/shared.tsx, apps/desktop/src/components/settings/ai/llm/configure.tsx
  • Summary: Convert provider baseUrl from { value, immutable } to a plain string; add a generic "custom" provider; refactor ProviderCard to accept a single config prop and drive UI/logic from config.* (id, icon, displayName, badge, baseUrl, apiKey).

STT Provider Configuration
  • Files: apps/desktop/src/components/settings/ai/stt/shared.tsx, apps/desktop/src/components/settings/ai/stt/configure.tsx
  • Summary: Convert provider baseUrl from { value, immutable } to a string (or undefined); replace the provider-specific custom entry (e.g., deepgram-custom) with a generic "custom" provider; adjust NonHyprProviderCard defaults and the conditional rendering of base_url and the Advanced section; update Deepgram-related guidance text.

Shared Form Field
  • Files: apps/desktop/src/components/settings/ai/shared/index.tsx
  • Summary: Remove the hidden prop from FormField; simplify error handling (drop isDirty; derive hasError from isTouched + errors); expand errorMessage extraction to handle object-shaped errors.

LLM Connection Hook
  • Files: apps/desktop/src/hooks/useLLMConnection.ts
  • Summary: Read the provider base URL directly from providerDefinition?.baseUrl (a string) instead of providerDefinition?.baseUrl.value.

Store Schema
  • Files: apps/desktop/src/store/tinybase/internal.ts
  • Summary: Relax aiProviderSchema.api_key from z.string().min(1) to z.string(), and add a refine that requires api_key whenever base_url starts with "https:", with the message "API key is required for HTTPS URLs" (see the sketch after this list).
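A minimal sketch of the relaxed store schema described in the last item above, assuming zod and the field names used in the summary (the real aiProviderSchema in internal.ts has additional fields, and attaching the error to api_key is an assumption):

```ts
import { z } from "zod";

const aiProviderSchema = z
  .object({
    base_url: z.string(),
    api_key: z.string(), // previously z.string().min(1)
  })
  .refine(
    // require an API key only when the base URL is an https: endpoint
    (row) => !row.base_url.startsWith("https:") || row.api_key.length > 0,
    { message: "API key is required for HTTPS URLs", path: ["api_key"] },
  );

// e.g. a local HTTP endpoint with no key still validates:
// aiProviderSchema.safeParse({ base_url: "http://localhost:11434", api_key: "" }).success === true
```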

Sequence Diagram(s)

sequenceDiagram
  actor User
  participant UI as ProviderCard (config)
  participant Form as FormState
  participant Context as ProviderContext
  participant Store as aiProviderSchema

  Note right of UI #eef2ff: ProviderCard driven by `config` (id, icon, displayName, baseUrl, apiKey)

  User->>UI: open provider panel
  UI->>Form: set defaults (base_url = config.baseUrl ?? "", api_key = existing or "")
  alt config.baseUrl truthy
    UI->>UI: show Advanced section with editable base_url
  else config.baseUrl falsy
    UI->>UI: show base_url input in main form
  end
  User->>Form: edit fields
  Form->>Context: update selected provider id = config.id
  Form->>Store: validate (aiProviderSchema)
  Store-->>Form: validation result (api_key required if base_url startsWith https:)
  Form-->>UI: show errors / submit

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~30–45 minutes

Possibly related PRs

Pre-merge checks and finishing touches

❌ Failed checks (1 warning, 2 inconclusive)
Docstring Coverage — ⚠️ Warning
  • Explanation: Docstring coverage is 0.00%, below the required threshold of 80.00%.
  • Resolution: Run @coderabbitai generate docstrings to improve docstring coverage.

Title Check — ❓ Inconclusive
  • Explanation: The title "AI setting custom" is vague and grammatically awkward. It references "custom", which aligns with the new custom provider support, but it does not convey the main architectural changes: refactoring provider configuration to a config-driven approach and converting baseUrl fields from objects to plain strings across multiple provider definitions. A reader scanning the history would struggle to understand the scope of the change from the title alone.
  • Resolution: Use a more specific, descriptive title, e.g. "Refactor AI provider configuration to use config-driven approach" or "Add custom provider support and simplify provider baseUrl structure".

Description Check — ❓ Inconclusive
  • Explanation: No pull request description was provided. The check is intentionally lenient and passes when a description relates in some way to the changeset, but with no description at all there is nothing to evaluate against that criterion.
  • Resolution: Add a description explaining the rationale and scope of the changes; even a brief note covering the config-driven provider refactoring, the baseUrl simplification, and the new custom provider support would give reviewers useful context.

📜 Recent review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 634b0c7 and ccc0dba.

📒 Files selected for processing (2)
  • apps/desktop/src/components/settings/ai/shared/index.tsx (1 hunks)
  • apps/desktop/src/store/tinybase/internal.ts (1 hunks)


Comment @coderabbitai help to get the list of available commands and usage tips.


@coderabbitai (coderabbitai bot) left a comment


Actionable comments posted: 1

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2526a87 and 634b0c7.

📒 Files selected for processing (7)
  • apps/desktop/src/components/settings/ai/llm/configure.tsx (3 hunks)
  • apps/desktop/src/components/settings/ai/llm/shared.tsx (2 hunks)
  • apps/desktop/src/components/settings/ai/shared/index.tsx (1 hunks)
  • apps/desktop/src/components/settings/ai/stt/configure.tsx (4 hunks)
  • apps/desktop/src/components/settings/ai/stt/shared.tsx (2 hunks)
  • apps/desktop/src/hooks/useLLMConnection.ts (1 hunks)
  • apps/desktop/src/store/tinybase/internal.ts (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
apps/desktop/**/*.{tsx,jsx}

📄 CodeRabbit inference engine (apps/desktop/.cursor/rules/style.mdc)

apps/desktop/**/*.{tsx,jsx}: When there are many Tailwind classNames with conditional logic, use the utility cn imported as import { cn } from "@hypr/utils"
When using cn for Tailwind classNames, always pass an array
Group Tailwind classNames by logical sections when using cn (split array items by grouping)

Files:

  • apps/desktop/src/components/settings/ai/shared/index.tsx
  • apps/desktop/src/components/settings/ai/llm/configure.tsx
  • apps/desktop/src/components/settings/ai/stt/configure.tsx
  • apps/desktop/src/components/settings/ai/stt/shared.tsx
  • apps/desktop/src/components/settings/ai/llm/shared.tsx
🧬 Code graph analysis (3)
apps/desktop/src/components/settings/ai/llm/configure.tsx (2)
apps/desktop/src/components/settings/ai/llm/shared.tsx (1)
  • PROVIDERS (6-63)
apps/desktop/src/components/settings/ai/shared/index.tsx (2)
  • useProvider (9-21)
  • FormField (23-68)
apps/desktop/src/components/settings/ai/stt/configure.tsx (2)
crates/transcribe-deepgram/src/service.rs (1)
  • config (34-37)
apps/desktop/src/components/settings/ai/shared/index.tsx (1)
  • FormField (23-68)
apps/desktop/src/components/settings/ai/stt/shared.tsx (1)
plugins/local-stt/js/bindings.gen.ts (1)
  • SupportedSttModel (98-98)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: zizmor
  • GitHub Check: ci (macos, macos-14)
🔇 Additional comments (15)
apps/desktop/src/hooks/useLLMConnection.ts (1)

56-56: LGTM!

The change correctly updates the baseUrl retrieval to match the new string-based provider schema. The fallback logic properly handles the new "custom" provider (where baseUrl is undefined) by falling back to the user-provided base_url from config or an empty string, which is then validated on lines 59-61.
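In other words, the hook now resolves the URL with a fallback chain roughly like the following sketch (names approximate the hook's locals, not copied verbatim):

```ts
function resolveBaseUrl(
  providerDefinition: { baseUrl?: string } | undefined,
  config: { base_url?: string } | undefined,
): string {
  // Predefined providers supply baseUrl as a plain string; the "custom"
  // provider leaves it undefined, so the user-entered base_url wins, else "".
  return providerDefinition?.baseUrl ?? config?.base_url ?? "";
}
```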

apps/desktop/src/components/settings/ai/shared/index.tsx (1)

43-43: LGTM!

The removal of the hidden prop simplifies the component. Visibility control is now handled at a higher level through conditional rendering (e.g., {!config.baseUrl && <form.Field...>} in the parent components), which is a cleaner approach.
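For context, the simplified error derivation mentioned in the walkthrough looks roughly like this; the field-meta shape follows TanStack Form conventions and the object-shaped error handling is an assumption about the actual index.tsx:

```ts
// Hypothetical field meta shape, mirroring what TanStack Form exposes.
type FieldMeta = {
  isTouched: boolean;
  errors: Array<string | { message?: string }>;
};

function deriveError(meta: FieldMeta): { hasError: boolean; errorMessage?: string } {
  // isDirty is no longer consulted; touched + any error is enough.
  const hasError = meta.isTouched && meta.errors.length > 0;
  const errorMessage = hasError
    ? meta.errors
        .map((e) => (typeof e === "string" ? e : e?.message)) // handle object-shaped errors
        .filter(Boolean)
        .join(", ")
    : undefined;
  return { hasError, errorMessage };
}
```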

apps/desktop/src/components/settings/ai/llm/shared.tsx (3)

1-1: LGTM!

The Icon import is added to support the new "custom" provider's icon rendering.


13-53: LGTM!

The migration from object-based baseUrl structure ({ value, immutable }) to plain string is consistent across all providers and aligns with the PR's goal of simplifying provider configuration.


55-62: LGTM!

The new "custom" provider entry is well-structured with:

  • baseUrl: undefined to require user input
  • apiKey: true to indicate API key requirement
  • Appropriate icon and display name

This enables users to configure custom OpenAI-compatible endpoints.
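Concretely, the entry probably looks something like the sketch below; the literals other than baseUrl: undefined and apiKey: true (which come from the bullets above) are guesses, and icon handling is omitted:

```ts
const customProvider = {
  id: "custom",
  displayName: "Custom",
  // no preset endpoint: the form shows base_url as a required field instead
  baseUrl: undefined,
  // an API key input is rendered for this provider
  apiKey: true,
} as const;
```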

apps/desktop/src/components/settings/ai/stt/configure.tsx (3)

55-55: LGTM!

The default value now properly falls back to config.baseUrl, which provides sensible defaults for predefined providers while allowing the "custom" provider to start with an empty string.


99-138: LGTM! Well-designed conditional UI pattern.

The conditional rendering creates two distinct UX flows:

  • Custom provider (!config.baseUrl): Base URL field shown upfront (required input)
  • Predefined providers (config.baseUrl): Base URL pre-configured, editable in Advanced section (optional override)

This is intuitive and follows good UX principles.

As per coding guidelines, the cn usage properly passes an array with logical grouping.


391-396: No issues found — messaging is intentionally different based on backend API requirements.

The STT and LLM custom provider messages reflect accurate, distinct backend implementations:

  • STT (Deepgram): Uses the deepgram::Deepgram SDK, which requires "Deepgram compatible" endpoints
  • LLM (OpenAI): Uses the async_openai SDK with custom api_base support, which requires "OpenAI compatible" endpoints

Both backends support custom base URLs through their respective SDKs. The messaging differences are intentional and correct—not inconsistencies requiring updates.

Likely an incorrect or invalid review comment.

apps/desktop/src/components/settings/ai/llm/configure.tsx (5)

19-28: LGTM!

The refactoring to a single config prop improves maintainability and aligns with the config-driven provider model. This reduces prop drilling and makes the component more cohesive.


35-35: LGTM!

The default base_url properly uses config.baseUrl ?? "", which provides sensible defaults for predefined providers while allowing the "custom" provider to start with an empty string.


60-66: LGTM!

The UI elements now correctly derive from the config object properties (icon, displayName, badge), maintaining consistency with the config-driven approach.

As per coding guidelines, the cn usage properly passes an array.


78-119: LGTM! Excellent conditional rendering logic.

The multi-tiered conditional rendering is well-designed:

  1. Base URL field (lines 78-88): Shown upfront when !config.baseUrl (custom provider)
  2. API Key field (lines 89-101): Shown only when config.apiKey is true (excludes Ollama, LM Studio, Hyprnote)
  3. Advanced section (lines 102-119): Shown when config.baseUrl exists, allowing users to override predefined Base URLs

This creates an intuitive UX that adapts to each provider's requirements.

As per coding guidelines, the cn usage is not applicable here as there are no conditional Tailwind classes.
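Putting the three tiers together, the branching is roughly the following sketch; BaseUrlField, ApiKeyField, and Advanced are stand-in components (the real configure.tsx uses TanStack Form fields), so treat this as an illustration of the conditional rendering, not the actual JSX:

```tsx
import type { ReactNode } from "react";

// Stand-in field components for illustration only.
const BaseUrlField = (props: { required?: boolean; defaultValue?: string }) => <input {...props} />;
const ApiKeyField = () => <input type="password" />;
const Advanced = ({ children }: { children: ReactNode }) => <details>{children}</details>;

type ProviderConfig = { baseUrl?: string; apiKey: boolean };

function ProviderFields({ config }: { config: ProviderConfig }) {
  return (
    <>
      {/* 1. Custom provider: no preset URL, so base_url is a required, upfront field */}
      {!config.baseUrl && <BaseUrlField required />}

      {/* 2. API key only when the provider needs one (skipped for Ollama, LM Studio, Hyprnote) */}
      {config.apiKey && <ApiKeyField />}

      {/* 3. Predefined providers: preset URL, overridable under Advanced */}
      {config.baseUrl && (
        <Advanced>
          <BaseUrlField defaultValue={config.baseUrl} />
        </Advanced>
      )}
    </>
  );
}
```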


129-130: LGTM!

The custom provider context message appropriately indicates "OpenAI compatible" endpoint support, which is accurate for LLM providers.

apps/desktop/src/components/settings/ai/stt/shared.tsx (2)

1-1: LGTM!

The Icon import is added to support the new "custom" provider's icon rendering.


30-65: LGTM!

The changes are consistent with the LLM provider refactoring:

  • baseUrl migrated from object structure to plain string across all providers
  • New "custom" provider added with baseUrl: undefined and empty models array

The empty models array for the custom provider is appropriate since custom endpoints would have their own model lists.

@yujonglee yujonglee merged commit ba854b9 into main Oct 21, 2025
3 of 4 checks passed
@yujonglee yujonglee deleted the ai-setting-custom branch October 21, 2025 04:52
This was referenced Nov 12, 2025
