Model select hotfix #1185
Conversation
📝 Walkthrough

The changes expand the list of available speech-to-text models with new quantized variants, enhance the frontend's synchronization with backend model selection for both STT and LLM models, and introduce polling to keep STT model download statuses updated. Localization files are updated with new source code line references, reflecting code shifts.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant UI as UI (LocalAI)
    participant Backend
    User->>UI: Selects STT or LLM model
    alt STT model selected and downloaded
        UI->>Backend: setCurrentSTTModel(selectedModel)
    end
    alt LLM model selected and custom LLM disabled
        UI->>Backend: setCurrentLLMModel(selectedModel)
        UI->>Backend: disableCustomLLM()
    end
    loop Every 3 seconds
        UI->>Backend: fetchSTTModelDownloadStatus()
        Backend-->>UI: Returns download status for all STT models
        UI->>UI: Update local STT model status
    end
    UI->>Backend: fetchCurrentLLMModel()
    Backend-->>UI: Returns current LLM model
    UI->>Backend: fetchCurrentSTTModel()
    Backend-->>UI: Returns current STT model
    UI->>UI: Sync local state with backend
```
Estimated code review effort: 3 (90–240 minutes)
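The 3-second polling loop in the diagram above can be sketched in isolation. This is a minimal illustration only: `fetchSTTModelDownloadStatus` is a mocked stand-in for the Tauri backend command, and the interval value comes from the walkthrough, not the actual plugin bindings.

```typescript
// Hypothetical sketch of the STT download-status polling loop.
// fetchSTTModelDownloadStatus is a stand-in for the real backend command.
type DownloadStatus = Record<string, boolean>;

async function fetchSTTModelDownloadStatus(): Promise<DownloadStatus> {
  // Mocked response; the real command would call into the Rust backend.
  return { QuantizedTiny: true, QuantizedLargeTurbo: false };
}

let localStatus: DownloadStatus = {};

// Start polling and return a cleanup function, e.g. for a useEffect teardown.
function startStatusPolling(intervalMs = 3000): () => void {
  const id = setInterval(async () => {
    localStatus = await fetchSTTModelDownloadStatus();
  }, intervalMs);
  return () => clearInterval(id);
}
```

In the real component this would more likely live in a `useQuery` with `refetchInterval: 3000` rather than a bare `setInterval`.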
Actionable comments posted: 2
🧹 Nitpick comments (1)
apps/desktop/src/components/settings/views/ai.tsx (1)
418-441: Consider deriving model keys dynamically to improve maintainability

The model keys are hard-coded in the status check, which could lead to maintenance issues if models are added or removed.
```diff
- queryFn: async () => {
-   const statusChecks = await Promise.all([
-     localSttCommands.isModelDownloaded("QuantizedTiny"),
-     localSttCommands.isModelDownloaded("QuantizedTinyEn"),
-     localSttCommands.isModelDownloaded("QuantizedBase"),
-     localSttCommands.isModelDownloaded("QuantizedBaseEn"),
-     localSttCommands.isModelDownloaded("QuantizedSmall"),
-     localSttCommands.isModelDownloaded("QuantizedSmallEn"),
-     localSttCommands.isModelDownloaded("QuantizedLargeTurbo"),
-   ]);
-   return {
-     "QuantizedTiny": statusChecks[0],
-     "QuantizedTinyEn": statusChecks[1],
-     "QuantizedBase": statusChecks[2],
-     "QuantizedBaseEn": statusChecks[3],
-     "QuantizedSmall": statusChecks[4],
-     "QuantizedSmallEn": statusChecks[5],
-     "QuantizedLargeTurbo": statusChecks[6],
-   } as Record<string, boolean>;
+ queryFn: async () => {
+   const statusPromises = sttModels.map(model =>
+     localSttCommands.isModelDownloaded(model.key as any)
+   );
+   const statusChecks = await Promise.all(statusPromises);
+
+   return sttModels.reduce((acc, model, index) => {
+     acc[model.key] = statusChecks[index];
+     return acc;
+   }, {} as Record<string, boolean>);
```
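For illustration, the suggested reduce pattern can be exercised standalone. Here `sttModels` and `isModelDownloaded` are mocked stand-ins for the real model list and plugin binding, so the sketch runs without the app:

```typescript
// Illustrative stand-in for the component's model list.
const sttModels = [
  { key: "QuantizedTiny" },
  { key: "QuantizedBase" },
  { key: "QuantizedLargeTurbo" },
];

// Mocked stand-in for localSttCommands.isModelDownloaded.
async function isModelDownloaded(key: string): Promise<boolean> {
  return key !== "QuantizedLargeTurbo"; // pretend the large model is missing
}

// Derive the status map from the model list instead of hard-coding keys.
async function fetchStatuses(): Promise<Record<string, boolean>> {
  const checks = await Promise.all(sttModels.map((m) => isModelDownloaded(m.key)));
  return sttModels.reduce((acc, m, i) => {
    acc[m.key] = checks[i];
    return acc;
  }, {} as Record<string, boolean>);
}
```

Adding or removing a model then only touches the `sttModels` array, never the status check.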
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- apps/desktop/src/components/settings/views/ai.tsx (5 hunks)
- apps/desktop/src/locales/en/messages.po (11 hunks)
- apps/desktop/src/locales/ko/messages.po (11 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
**/*.{js,ts,tsx,rs}
Instructions used from:
Sources:
⚙️ CodeRabbit Configuration File
🧬 Code Graph Analysis (1)
apps/desktop/src/components/settings/views/ai.tsx (2)
plugins/local-llm/js/bindings.gen.ts (1)
- SupportedModel (55-55)

plugins/local-stt/js/bindings.gen.ts (1)
- SupportedModel (69-69)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: ci (windows, windows-latest)
- GitHub Check: ci (macos, macos-latest)
🔇 Additional comments (1)
apps/desktop/src/locales/en/messages.po (1)
349-1327: LGTM!

The line reference updates correctly reflect the shifts in the source file positions.
```tsx
onClick={() => {
  if (model.downloaded) {
    setSelectedSTTModel(model.key);
    localSttCommands.setCurrentModel(model.key as any);
```
🛠️ Refactor suggestion
Improve type safety by avoiding type assertions
The use of `as any` and `as SupportedModel` type assertions could hide type mismatches at compile time.
Consider updating the model type definitions to properly type the model keys, eliminating the need for type assertions. This would provide better compile-time type safety.
Also applies to: 627-627
🤖 Prompt for AI Agents
In apps/desktop/src/components/settings/views/ai.tsx at lines 489 and 627, the
code uses 'as any' and 'as SupportedModel' type assertions on model keys, which
reduces type safety. To fix this, update the type definitions for the model keys
so they correctly reflect the expected types without needing assertions. Adjust
the model and localSttCommands types to align, ensuring the model.key is typed
properly and can be passed directly to setCurrentModel without casting.
I would suggest making things more type-safe, but we'll rewrite the settings/ai tab anyway :)
No description provided.