feat: add Groq as free LLM provider #45
Conversation
Walkthrough

Adds Groq as an LLM provider: documentation is updated, and the CLI gains a "groq" branch that reads `GROQ_API_KEY` and calls a new internal Groq client. Introduces an internal Groq implementation using the OpenAI-compatible chat API, with corresponding unit tests backed by a mock HTTP server.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    actor User
    participant CLI as cmd/commit-msg
    participant Groq as internal/groq
    participant HTTP as Groq API
    User->>CLI: run commit message generation
    CLI->>CLI: Read COMMIT_LLM, GROQ_API_KEY, GROQ_MODEL, GROQ_API_URL
    alt COMMIT_LLM == "groq"
        CLI->>Groq: GenerateCommitMessage(config, changes, apiKey)
        Groq->>Groq: Build prompt and payload
        Groq->>HTTP: POST /chat/completions (Authorization: Bearer)
        HTTP-->>Groq: 200 JSON chat response
        Groq->>CLI: commit message (first choice)
    else other providers
        CLI->>CLI: Route to existing provider path
    end
    CLI-->>User: Generated commit message
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
Pre-merge checks: ❌ 1 failed (warning) | ✅ 4 passed
Actionable comments posted: 0
🧹 Nitpick comments (1)
README.md (1)
31-31: LGTM! Clear documentation of the new Groq provider.

The additions properly document Groq setup and usage, maintaining consistency with other providers.

Optional enhancement: document the additional environment variables in the table.

For completeness, consider adding `GROQ_MODEL` and `GROQ_API_URL` to the environment variables table (lines 44-51), as they're mentioned in the Groq setup section but not explicitly shown in the main table. Similarly, `OLLAMA_URL` and `OLLAMA_MODEL` could be added since "ollama" is supported in the code. Example addition to the table:

```diff
 | `GROQ_API_KEY` | Your API key | Required if using Groq |
+| `GROQ_MODEL` | Model name (default: llama-3.3-70b-versatile) | Optional override for Groq |
+| `GROQ_API_URL` | Custom API endpoint | Optional override for Groq |
```

Also applies to: 44-51, 177-182
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
- `CONTRIBUTING.md` (2 hunks)
- `README.md` (3 hunks)
- `cmd/commit-msg/main.go` (3 hunks)
- `internal/groq/groq.go` (1 hunk)
- `internal/groq/groq_test.go` (1 hunk)
🧰 Additional context used
🧬 Code graph analysis (3)
internal/groq/groq.go (2)
- `pkg/types/types.go` (1): `Config` (4-7)
- `pkg/types/prompt.go` (1): `CommitPrompt` (3-19)

cmd/commit-msg/main.go (1)
- `internal/groq/groq.go` (1): `GenerateCommitMessage` (46-111)

internal/groq/groq_test.go (2)
- `internal/groq/groq.go` (1): `GenerateCommitMessage` (46-111)
- `pkg/types/types.go` (1): `Config` (4-7)
🔇 Additional comments (8)
CONTRIBUTING.md (1)
84-84: LGTM! Documentation updates are clear and complete.

The additions properly document Groq as a supported provider, maintaining consistency with the existing documentation structure for other LLM providers.
Also applies to: 93-93, 96-97
cmd/commit-msg/main.go (1)
14-14: LGTM! Groq integration follows existing patterns correctly.

The implementation properly validates the API key early and calls the Groq client using the same pattern as other providers. Error handling and function signatures are consistent with the established codebase conventions.
Also applies to: 44-48, 148-149
internal/groq/groq_test.go (3)
19-40: LGTM! Excellent test isolation using a helper function.

The `withTestServer` helper properly isolates tests by:
- Clearing environment variables with `t.Setenv`
- Starting a mock HTTP server with automatic cleanup
- Temporarily overriding global variables (`baseURL`, `httpClient`)
- Restoring the original state after each test
This prevents test interference and makes tests reliable.
42-86: LGTM! Comprehensive success path testing.

The test validates:
- HTTP method (POST)
- Authorization header format
- Request payload structure (model, message count)
- Response parsing and content extraction
This ensures the integration works correctly with the Groq API.
88-97: LGTM! Good coverage of error scenarios.

The tests properly validate:
- Non-200 HTTP status code handling
- Empty changes input validation
This ensures robust error handling in the client.
Also applies to: 99-106
internal/groq/groq.go (3)
15-33: LGTM! Clean type definitions for the Groq API.

The structs properly model the OpenAI-compatible chat completions API, with appropriate JSON tags for marshaling/unmarshaling.
35-43: LGTM! Well-documented defaults with test override support.

The `defaultModel` constant documents the reason for the choice (Groq's retirement of the preview model), and package-level variables allow test overrides while maintaining a reasonable 30-second timeout.
45-111: LGTM! Robust implementation with comprehensive error handling.

The function demonstrates excellent practices:
- Input validation (lines 47-49)
- Environment-based configuration with sensible defaults (lines 53-56, 73-76)
- Proper error wrapping with `%w` for error chain preservation
- Status code validation with the response body included in the error message (line 98)
- Empty response validation (lines 106-108)
- Resource cleanup with `defer resp.Body.Close()` (line 90)

The implementation follows Go best practices and is consistent with other LLM providers in the codebase.
DFanso left a comment:
LGTM 🎉
Description
Type of Change
Related Issue
Fixes #28
Changes Made
- Added an `internal/groq` client targeting Groq's OpenAI-compatible chat completions API
- Wired `COMMIT_LLM=groq` in `cmd/commit-msg/main.go` with API key validation and optional overrides
- Updated `README.md` and `CONTRIBUTING.md`
- Default model is `llama-3.3-70b-versatile`, with env-based overrides

Testing
Checklist
Screenshots (if applicable)
Additional Notes
Groq retired the preview model `llama-3.2-90b-text-preview`; the new default is `llama-3.3-70b-versatile`. Users can override it via `GROQ_MODEL`.

For Hacktoberfest Participants
Summary by CodeRabbit

New Features
- Groq is now supported as an LLM provider (`COMMIT_LLM=groq`), configured via `GROQ_API_KEY`, `GROQ_MODEL`, and `GROQ_API_URL`.

Documentation
- `README.md` and `CONTRIBUTING.md` updated to document Groq setup and usage.

Tests
- Unit tests for the Groq client using a mock HTTP server.