refactor: Move LLM provider strings to type-safe enum in pkg/types #73
Conversation
Walkthrough

Implements an end-to-end commit message generation command using repo context, introduces a type-safe LLM provider enum, refactors CLI setup and store to use it, expands secret scrubbing patterns, and makes minor formatting changes across providers and git operations.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    actor User
    participant CLI as CLI createmsg
    participant Git as Git Ops
    participant Scrub as Scrubber
    participant LLM as LLM Provider (OpenAI/Claude/Gemini/Groq/Grok/Ollama)
    participant CB as Clipboard
    participant Term as Terminal
    User->>CLI: Run command
    CLI->>Git: Validate repo, collect changes/stats
    Git-->>CLI: Diffs + untracked text
    CLI->>Scrub: Scrub sensitive data
    Scrub-->>CLI: Redacted context
    CLI->>CLI: Select provider via types.ParseLLMProvider
    CLI->>LLM: GenerateCommitMessage(context, config)
    note over LLM,CLI: Show spinner while waiting
    LLM-->>CLI: Commit message or error
    alt Error
        CLI->>Term: Print error details
    else Success
        CLI->>CB: Copy message
        CLI->>Term: Show message and change preview
    end
```
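For orientation, here is a minimal sketch of what a string-backed provider enum of this shape might look like. The names (`LLMProvider`, the `Provider*` constants, `ParseLLMProvider`, `GetSupportedProviderStrings`) mirror the symbols referenced later in this review; the string values (apart from "Ollama", which appears in the diff further down) and the exact signatures are assumptions, not the PR's actual code.

```go
// Package types: hypothetical sketch of a type-safe LLM provider enum.
package types

import (
	"fmt"
	"strings"
)

// LLMProvider is a string-backed enum identifying a supported LLM backend.
type LLMProvider string

const (
	ProviderOpenAI LLMProvider = "OpenAI" // string values other than "Ollama" are assumed
	ProviderClaude LLMProvider = "Claude"
	ProviderGemini LLMProvider = "Gemini"
	ProviderGrok   LLMProvider = "Grok"
	ProviderGroq   LLMProvider = "Groq"
	ProviderOllama LLMProvider = "Ollama"
)

// String returns the human-readable provider name.
func (p LLMProvider) String() string { return string(p) }

// IsValid reports whether p is one of the known providers.
func (p LLMProvider) IsValid() bool {
	switch p {
	case ProviderOpenAI, ProviderClaude, ProviderGemini, ProviderGrok, ProviderGroq, ProviderOllama:
		return true
	}
	return false
}

// GetSupportedProviderStrings lists provider names for prompts and help text.
func GetSupportedProviderStrings() []string {
	return []string{
		ProviderOpenAI.String(), ProviderClaude.String(), ProviderGemini.String(),
		ProviderGrok.String(), ProviderGroq.String(), ProviderOllama.String(),
	}
}

// ParseLLMProvider converts user input into an LLMProvider, case-insensitively.
func ParseLLMProvider(s string) (LLMProvider, error) {
	for _, p := range []LLMProvider{ProviderOpenAI, ProviderClaude, ProviderGemini, ProviderGrok, ProviderGroq, ProviderOllama} {
		if strings.EqualFold(s, p.String()) {
			return p, nil
		}
	}
	return "", fmt.Errorf("unsupported LLM provider: %q", s)
}
```

With this shape, call sites compare against `types.ProviderOllama` (or its `String()` form for prompt values) instead of the raw `"Ollama"` literal, which is exactly the change visible in the llmSetup.go diff below.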
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
internal/scrubber/scrubber.go (1)
30-33: Fix redaction format for authorization header.

The replacement unconditionally appends a double quote, so inputs without quotes (or with single quotes) now end up with stray characters like `Authorization=[REDACTED_AUTH_TOKEN]"`. Please capture the trailing quote in the regex and reuse it in the replacement to keep formatting intact.

```diff
-	{
-		Name:    "Authorization Header",
-		Pattern: regexp.MustCompile(`(?i)(authorization\s*[=:]\s*["\']?)([a-zA-Z0-9_\-\.]{20,})["\']?`),
-		Redact:  "${1}[REDACTED_AUTH_TOKEN]\"",
-	},
+	{
+		Name:    "Authorization Header",
+		Pattern: regexp.MustCompile(`(?i)(authorization\s*[=:]\s*)(["\']?)([a-zA-Z0-9_\-\.]{20,})(["\']?)`),
+		Redact:  "${1}${2}[REDACTED_AUTH_TOKEN]${4}",
+	},
```
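As a quick sanity check on the suggestion above, the snippet below applies the proposed pattern and replacement to quoted and unquoted inputs. Only the regex and the `Redact` string come from the suggestion; the sample token values are made up.

```go
package main

import (
	"fmt"
	"regexp"
)

func main() {
	// Capturing the optional quotes lets the replacement reuse them, so quoted
	// and unquoted Authorization values stay well-formed after redaction.
	pattern := regexp.MustCompile(`(?i)(authorization\s*[=:]\s*)(["\']?)([a-zA-Z0-9_\-\.]{20,})(["\']?)`)
	redact := "${1}${2}[REDACTED_AUTH_TOKEN]${4}"

	samples := []string{
		`Authorization: "sk_live_abcdefghijklmnopqrstuvwxyz"`, // double-quoted value
		`authorization=abcdefghijklmnopqrstuvwxyz123456`,      // bare value, no quotes
	}
	for _, s := range samples {
		fmt.Println(pattern.ReplaceAllString(s, redact))
	}
	// Prints:
	//   Authorization: "[REDACTED_AUTH_TOKEN]"
	//   authorization=[REDACTED_AUTH_TOKEN]
}
```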
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (11)

- cmd/cli/createMsg.go (2 hunks)
- cmd/cli/llmSetup.go (2 hunks)
- cmd/cli/root.go (1 hunks)
- cmd/cli/store/store.go (12 hunks)
- internal/chatgpt/chatgpt.go (3 hunks)
- internal/claude/claude.go (1 hunks)
- internal/git/operations.go (2 hunks)
- internal/ollama/ollama.go (3 hunks)
- internal/scrubber/scrubber.go (8 hunks)
- internal/scrubber/scrubber_test.go (9 hunks)
- pkg/types/types.go (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (7)
- internal/scrubber/scrubber_test.go (1)
  - internal/scrubber/scrubber.go (2): ScrubDiff (143-152), GetDetectedPatterns (182-190)
- pkg/types/types.go (1)
  - cmd/cli/store/store.go (1): LLMProvider (14-17)
- internal/ollama/ollama.go (3)
  - internal/chatgpt/chatgpt.go (1): GenerateCommitMessage (13-32)
  - internal/claude/claude.go (1): GenerateCommitMessage (33-85)
  - pkg/types/types.go (1): Config (53-56)
- cmd/cli/llmSetup.go (2)
  - pkg/types/types.go (3): GetSupportedProviderStrings (38-45), ParseLLMProvider (47-50), ProviderOllama (11-11)
  - cmd/cli/store/store.go (4): ListSavedModels (182-212), ChangeDefault (214-248), UpdateAPIKey (303-342), DeleteModel (250-301)
- internal/git/operations.go (2)
  - internal/utils/utils.go (2): IsTextFile (19-35), IsSmallFile (38-47)
  - internal/scrubber/scrubber.go (1): ScrubEnvFile (193-231)
- cmd/cli/store/store.go (1)
  - pkg/types/types.go (1): LLMProvider (3-3)
- cmd/cli/createMsg.go (8)
  - cmd/cli/store/store.go (2): DefaultLLMKey (142-180), Config (19-22)
  - internal/git/operations.go (2): IsRepository (16-23), GetChanges (26-130)
  - pkg/types/types.go (8): Config (53-56), RepoConfig (59-62), ProviderGemini (8-8), ProviderOpenAI (6-6), ProviderClaude (7-7), ProviderGroq (10-10), ProviderOllama (11-11), ProviderGrok (9-9)
  - internal/stats/statistics.go (1): GetFileStatistics (14-74)
  - internal/display/display.go (3): ShowFileStatistics (20-96), ShowCommitMessage (99-115), ShowChangesPreview (118-133)
  - internal/chatgpt/chatgpt.go (1): GenerateCommitMessage (13-32)
  - internal/claude/claude.go (1): GenerateCommitMessage (33-85)
  - internal/ollama/ollama.go (1): GenerateCommitMessage (23-74)
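The `cmd/cli/createMsg.go` entry above wires repo validation, change collection, scrubbing, generation, and display together. Below is a rough, self-contained skeleton of that flow; every stub (`isRepository`, `getChanges`, `scrubDiff`, `showMessage`) and every signature is a hypothetical stand-in for illustration, not the repository's actual helpers.

```go
package main

import (
	"errors"
	"fmt"
)

// commitGenerator stands in for a provider package's GenerateCommitMessage
// (assumed shape: scrubbed diff context in, commit message out).
type commitGenerator func(scrubbedContext string) (string, error)

// Stubs standing in for the repo's git, scrubber, and display helpers.
var (
	isRepository = func() bool { return true }
	getChanges   = func() (string, error) { return "diff --git a/main.go b/main.go", nil }
	scrubDiff    = func(d string) string { return d } // real code would redact secrets here
	showMessage  = func(msg string) { fmt.Println(msg) }
)

func runCreateMsg(gen commitGenerator) error {
	if !isRepository() {
		return errors.New("not a git repository")
	}
	diff, err := getChanges()
	if err != nil {
		return fmt.Errorf("collecting changes: %w", err)
	}
	msg, err := gen(scrubDiff(diff))
	if err != nil {
		return fmt.Errorf("generating commit message: %w", err)
	}
	showMessage(msg)
	return nil
}

func main() {
	// Wire in a fake generator for illustration; the real CLI would pick one
	// based on the parsed provider enum.
	fake := func(scrubbedContext string) (string, error) {
		return "refactor: move LLM provider strings to a type-safe enum", nil
	}
	if err := runCreateMsg(fake); err != nil {
		fmt.Println("error:", err)
	}
}
```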
In cmd/cli/llmSetup.go:

```diff
 	apiKeyPrompt := promptui.Prompt{
 		Label: "Enter API Key",
 	}
 
-	if model == "Ollama" {
+	if model == types.ProviderOllama.String() {
 		prompt = promptui.Select{
 			Label: "Select Option",
 			Items: options2,
 		}
 
-		apiKeyPrompt = promptui.Prompt {
+		apiKeyPrompt = promptui.Prompt{
 			Label: "Enter URL",
 		}
 	}
```
Restore masked input for API keys.

Line 105 now collects API keys without a mask, so secrets are echoed back on screen. That's a regression from the previous masked prompt and leaks credentials over the shoulder or into screen recordings. Please add `Mask: '*'` (while leaving the Ollama URL prompt unmasked).

```diff
-	apiKeyPrompt := promptui.Prompt{
-		Label: "Enter API Key",
-	}
+	apiKeyPrompt := promptui.Prompt{
+		Label: "Enter API Key",
+		Mask:  '*',
+	}
```

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```go
	apiKeyPrompt := promptui.Prompt{
		Label: "Enter API Key",
		Mask:  '*',
	}

	if model == types.ProviderOllama.String() {
		prompt = promptui.Select{
			Label: "Select Option",
			Items: options2,
		}

		apiKeyPrompt = promptui.Prompt{
			Label: "Enter URL",
		}
	}
```
🤖 Prompt for AI Agents
In cmd/cli/llmSetup.go around lines 105 to 117, the API key prompt no longer masks input so secrets are echoed; restore masking by setting Mask: '*' on the apiKeyPrompt definition (keep the Ollama URL prompt unmasked for the model == ProviderOllama branch). Update the initial promptui.Prompt and the alternative assignment inside the Ollama conditional so the API key prompt includes Mask: '*' while leaving the URL prompt unchanged.
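For reference, here is a standalone sketch of the masked/unmasked prompt split the comment asks for, using the public `github.com/manifoldco/promptui` API; the surrounding flow is simplified and not taken from this repository's code.

```go
package main

import (
	"fmt"

	"github.com/manifoldco/promptui"
)

func main() {
	apiKeyPrompt := promptui.Prompt{
		Label: "Enter API Key",
		Mask:  '*', // echo asterisks instead of the secret
	}
	key, err := apiKeyPrompt.Run()
	if err != nil {
		fmt.Println("prompt failed:", err)
		return
	}

	urlPrompt := promptui.Prompt{
		Label: "Enter URL", // Ollama endpoint; not a secret, so no mask
	}
	url, _ := urlPrompt.Run()

	fmt.Printf("received key of length %d for endpoint %s\n", len(key), url)
}
```

Masking only the API key keeps the Ollama URL visible for easy verification while never echoing the secret itself.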
DFanso left a comment
LGTM 🎉
Description
This pull request refactors the handling of LLM provider types across the CLI codebase, replacing string-based provider identification with a strongly-typed approach using the new `LLMProvider` type in `pkg/types`.

Type of Change
Related Issue
Fixes #59
Changes Made
- Added a new `LLMProvider` type in `pkg/types/types.go` with constants for each provider, helper methods for validation, string conversion, and parsing, plus functions to list supported providers.
- Updated `cmd/cli/llmSetup.go` to use the new `LLMProvider` type and helper functions, replacing raw string usage.
- Ran `gofmt -w .` to format the code.

Testing
Checklist
Screenshots (if applicable)
Additional Notes
For Hacktoberfest Participants
Thank you for your contribution! 🎉
Summary by CodeRabbit
New Features
Improvements
Bug Fixes