
Conversation

Contributor

@Muneer320 Muneer320 commented Oct 5, 2025

Description

  • Add Groq as a free LLM provider option and wire it through the CLI
  • Document Groq environment variables and usage
  • Update the default Groq model after the 90B preview deprecation

Type of Change

  • New feature (non-breaking change which adds functionality)
  • Bug fix (non-breaking change which fixes an issue)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Documentation update
  • Code refactoring
  • Performance improvement
  • Other (please describe):

Related Issue

Fixes #28

Changes Made

  • introduce internal/groq client targeting Groq’s OpenAI-compatible chat completions API
  • support COMMIT_LLM=groq in cmd/commit-msg/main.go with API key validation and optional overrides
  • add Groq usage notes to README.md and CONTRIBUTING.md
  • set the default Groq model to llama-3.3-70b-versatile with env-based overrides
  • cover the new client with httptest-based unit tests

Testing

  • Tested with Gemini API
  • Tested with Grok API
  • Tested on Windows
  • Tested on Linux
  • Tested on macOS
  • Added/updated tests (if applicable)

Checklist

  • My code follows the project's code style
  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings or errors
  • I have tested this in a real Git repository
  • I have read the CONTRIBUTING.md guidelines

Screenshots (if applicable)

Additional Notes

  • Groq retired llama-3.2-90b-text-preview; the new default is llama-3.3-70b-versatile. Users can override via GROQ_MODEL.

For Hacktoberfest Participants

  • This PR is submitted as part of Hacktoberfest 2025

Summary by CodeRabbit

  • New Features

    • Added Groq as a supported LLM provider. Configure with COMMIT_LLM=groq and GROQ_API_KEY, with optional GROQ_MODEL and GROQ_API_URL. Clear validation error if API key is missing.
  • Documentation

    • Updated setup guides to include Groq, environment variable details, and provider descriptions.
    • Added Groq-specific setup steps and examples.
  • Tests

    • Added unit tests covering Groq request/response handling, error cases, and environment variable validation.

Contributor

coderabbitai bot commented Oct 5, 2025

Walkthrough

Adds Groq as an LLM provider: documentation updated, CLI gains a "groq" branch that reads GROQ_API_KEY and calls a new internal Groq client. Introduces internal Groq implementation using OpenAI-compatible chat API and corresponding unit tests with a mock HTTP server.
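For orientation, here is a sketch of how that routing branch could look. Only the environment variable names and the groq.GenerateCommitMessage call are confirmed by this review; the wrapper function, its signature, and the import paths are illustrative assumptions.

```go
package main

import (
	"fmt"
	"os"

	"github.com/DFanso/commit-msg/internal/groq" // import path assumed
	"github.com/DFanso/commit-msg/pkg/types"     // Config type referenced by the review
)

// generateWithProvider is a hypothetical helper; the real switch lives inline
// in cmd/commit-msg/main.go and covers the existing providers as well.
func generateWithProvider(config *types.Config, changes string) (string, error) {
	switch os.Getenv("COMMIT_LLM") {
	case "groq":
		apiKey := os.Getenv("GROQ_API_KEY")
		if apiKey == "" {
			// Clear validation error when the key is missing.
			return "", fmt.Errorf("GROQ_API_KEY is required when COMMIT_LLM=groq")
		}
		return groq.GenerateCommitMessage(config, changes, apiKey)
	default:
		return "", fmt.Errorf("provider %q not covered in this sketch", os.Getenv("COMMIT_LLM"))
	}
}
```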

Changes

  • Documentation updates (CONTRIBUTING.md, README.md): Mention Groq as an LLM option; add GROQ_API_KEY, GROQ_MODEL, GROQ_API_URL; include Groq setup instructions; formatting tweaks.
  • CLI provider routing (cmd/commit-msg/main.go): Adds "groq" handling in the COMMIT_LLM switch: validates GROQ_API_KEY and calls groq.GenerateCommitMessage; imports internal/groq.
  • Groq client implementation (internal/groq/groq.go): New package implementing GenerateCommitMessage: builds the prompt, selects the model from the environment, posts to the Groq API, parses the chat response, and returns the first message or an error.
  • Groq integration tests (internal/groq/groq_test.go): Unit tests using a test server: success path, non-200 handling, empty-changes error; env isolation and cleanup helpers.

Sequence Diagram(s)

sequenceDiagram
  autonumber
  actor User
  participant CLI as cmd/commit-msg
  participant Groq as internal/groq
  participant HTTP as Groq API

  User->>CLI: run commit message generation
  CLI->>CLI: Read COMMIT_LLM, GROQ_API_KEY, GROQ_MODEL, GROQ_API_URL
  alt COMMIT_LLM == "groq"
    CLI->>Groq: GenerateCommitMessage(config, changes, apiKey)
    Groq->>Groq: Build prompt and payload
    Groq->>HTTP: POST /chat/completions (Authorization: Bearer)
    HTTP-->>Groq: 200 JSON chat response
    Groq->>CLI: commit message (first choice)
  else other providers
    CLI->>CLI: Route to existing provider path
  end
  CLI-->>User: Generated commit message

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related PRs

Suggested labels

enhancement, hacktoberfest, hacktoberfest-accepted, go

Suggested reviewers

  • DFanso

Poem

I twitched my ears at Groq’s new song,
A hop to chat where prompts belong.
With keys and models neatly set,
I thump the ground—no errors yet!
Commit notes bloom like clover dew,
Another nibble, merge it through. 🐇✨

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 33.33%, which is below the required threshold of 80.00%. You can run @coderabbitai generate docstrings to improve docstring coverage.
✅ Passed checks (4 passed)
  • Description Check (✅ Passed): Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check (✅ Passed): The title "feat: add Groq as free LLM provider" clearly and concisely captures the primary change of integrating Groq as a new free LLM provider option without extraneous details, making it immediately understandable to reviewers.
  • Linked Issues Check (✅ Passed): The pull request fully implements the objective of issue #28 by adding Groq as a free LLM provider in the CLI, updating documentation and environment variable examples, and providing client code and tests for Groq, which directly satisfies the requirement to introduce a free/open LLM option.
  • Out of Scope Changes Check (✅ Passed): All modifications are focused on integrating the Groq LLM provider, including code, documentation, and tests; there are no unrelated changes beyond the scope of adding a free LLM option.

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
README.md (1)

31-31: LGTM! Clear documentation of the new Groq provider.

The additions properly document Groq setup and usage, maintaining consistency with other providers.

Optional enhancement: Document additional environment variables in the table.

For completeness, consider adding GROQ_MODEL and GROQ_API_URL to the environment variables table (lines 44-51), as they're mentioned in the Groq setup section but not explicitly shown in the main table. Similarly, OLLAMA_URL and OLLAMA_MODEL could be added since "ollama" is supported in the code.

Example addition to the table:

 | `GROQ_API_KEY`   | Your API key                                     | Required if using Groq    |
+| `GROQ_MODEL`     | Model name (default: llama-3.3-70b-versatile)    | Optional override for Groq|
+| `GROQ_API_URL`   | Custom API endpoint                              | Optional override for Groq|

Also applies to: 44-51, 177-182

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 64baf24 and 2b056dd.

📒 Files selected for processing (5)
  • CONTRIBUTING.md (2 hunks)
  • README.md (3 hunks)
  • cmd/commit-msg/main.go (3 hunks)
  • internal/groq/groq.go (1 hunks)
  • internal/groq/groq_test.go (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (3)
  • internal/groq/groq.go (2)
    • pkg/types/types.go (1): Config (4-7)
    • pkg/types/prompt.go (1): CommitPrompt (3-19)
  • cmd/commit-msg/main.go (1)
    • internal/groq/groq.go (1): GenerateCommitMessage (46-111)
  • internal/groq/groq_test.go (2)
    • internal/groq/groq.go (1): GenerateCommitMessage (46-111)
    • pkg/types/types.go (1): Config (4-7)
🔇 Additional comments (8)
CONTRIBUTING.md (1)

84-84: LGTM! Documentation updates are clear and complete.

The additions properly document Groq as a supported provider, maintaining consistency with the existing documentation structure for other LLM providers.

Also applies to: 93-93, 96-97

cmd/commit-msg/main.go (1)

14-14: LGTM! Groq integration follows existing patterns correctly.

The implementation properly validates the API key early and calls the Groq client using the same pattern as other providers. Error handling and function signatures are consistent with the established codebase conventions.

Also applies to: 44-48, 148-149

internal/groq/groq_test.go (3)

19-40: LGTM! Excellent test isolation using helper function.

The withTestServer helper properly isolates tests by:

  • Clearing environment variables with t.Setenv
  • Starting a mock HTTP server with automatic cleanup
  • Temporarily overriding global variables (baseURL, httpClient)
  • Restoring original state after tests

This prevents test interference and makes tests reliable.
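For illustration, the isolation pattern described above might look roughly like this (a sketch assuming the usual net/http, net/http/httptest, and testing imports; the real withTestServer in groq_test.go may differ in signature and detail):

```go
// Sketch of an httptest-based isolation helper; details are assumed.
func withTestServer(t *testing.T, handler http.HandlerFunc, fn func()) {
	t.Helper()

	// Clear provider env vars so ambient configuration cannot leak into tests.
	t.Setenv("GROQ_API_KEY", "")
	t.Setenv("GROQ_MODEL", "")
	t.Setenv("GROQ_API_URL", "")

	// Mock HTTP server with automatic cleanup.
	srv := httptest.NewServer(handler)
	defer srv.Close()

	// Temporarily point the package-level baseURL and httpClient at the mock,
	// restoring the original values when the test finishes.
	origURL, origClient := baseURL, httpClient
	baseURL, httpClient = srv.URL, srv.Client()
	defer func() { baseURL, httpClient = origURL, origClient }()

	fn()
}
```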


42-86: LGTM! Comprehensive success path testing.

The test validates:

  • HTTP method (POST)
  • Authorization header format
  • Request payload structure (model, message count)
  • Response parsing and content extraction

This ensures the integration works correctly with the Groq API.
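A compact success-path test along those lines might look like the following sketch. It reuses the withTestServer sketch above, and the JSON field names simply follow the OpenAI-compatible chat format; the Config value and diff contents are placeholders:

```go
func TestGenerateCommitMessage_Success(t *testing.T) {
	handler := func(w http.ResponseWriter, r *http.Request) {
		// Validate HTTP method and Authorization header format.
		if r.Method != http.MethodPost {
			t.Errorf("expected POST, got %s", r.Method)
		}
		if got := r.Header.Get("Authorization"); got != "Bearer test-key" {
			t.Errorf("unexpected Authorization header: %q", got)
		}
		// Check the request payload carries a model and at least one message.
		var payload struct {
			Model    string            `json:"model"`
			Messages []json.RawMessage `json:"messages"`
		}
		if err := json.NewDecoder(r.Body).Decode(&payload); err != nil {
			t.Errorf("decode request: %v", err)
		}
		if payload.Model == "" || len(payload.Messages) == 0 {
			t.Errorf("unexpected payload: %+v", payload)
		}
		// Minimal OpenAI-compatible chat response.
		w.Header().Set("Content-Type", "application/json")
		fmt.Fprint(w, `{"choices":[{"message":{"role":"assistant","content":"feat: add groq provider"}}]}`)
	}

	withTestServer(t, handler, func() {
		msg, err := GenerateCommitMessage(&types.Config{}, "example diff", "test-key")
		if err != nil {
			t.Fatalf("unexpected error: %v", err)
		}
		if msg != "feat: add groq provider" {
			t.Errorf("unexpected message: %q", msg)
		}
	})
}
```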


88-97: LGTM! Good coverage of error scenarios.

The tests properly validate:

  • Non-200 HTTP status code handling
  • Empty changes input validation

This ensures robust error handling in the client.

Also applies to: 99-106

internal/groq/groq.go (3)

15-33: LGTM! Clean type definitions for Groq API.

The structs properly model the OpenAI-compatible chat completions API, with appropriate JSON tags for marshaling/unmarshaling.
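The shapes in question would be roughly as follows; the field names here are inferred from the OpenAI-compatible chat completions format rather than copied from groq.go, so treat them as an assumption:

```go
// Illustrative request/response types for Groq's OpenAI-compatible chat API.
type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
}

type chatResponse struct {
	Choices []struct {
		Message chatMessage `json:"message"`
	} `json:"choices"`
}
```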


35-43: LGTM! Well-documented defaults with test override support.

The defaultModel constant documents the reason for the choice (Groq's retirement of the preview model), and package-level variables allow test overrides while maintaining a reasonable 30-second timeout.
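In sketch form (the model name and the 30-second timeout come from this review; the endpoint URL and variable names are assumptions, with net/http and time imported):

```go
// defaultModel replaces the retired llama-3.2-90b-text-preview preview model.
const defaultModel = "llama-3.3-70b-versatile"

// Package-level seams so tests can redirect requests to an httptest server.
var (
	baseURL    = "https://api.groq.com/openai/v1/chat/completions" // assumed default; overridable via GROQ_API_URL
	httpClient = &http.Client{Timeout: 30 * time.Second}
)
```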


45-111: LGTM! Robust implementation with comprehensive error handling.

The function demonstrates excellent practices:

  • Input validation (lines 47-49)
  • Environment-based configuration with sensible defaults (lines 53-56, 73-76)
  • Proper error wrapping with %w for error chain preservation
  • Status code validation with response body in error message (line 98)
  • Empty response validation (lines 106-108)
  • Resource cleanup with defer resp.Body.Close() (line 90)

The implementation follows Go best practices and is consistent with other LLM providers in the codebase.
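Putting those points together, the control flow could be sketched as below. This builds on the struct and variable sketches above, assumes a hypothetical buildPrompt helper in place of the PR's actual prompt construction (which references types.CommitPrompt), and is not the literal contents of groq.go:

```go
func GenerateCommitMessage(config *types.Config, changes, apiKey string) (string, error) {
	// Input validation: refuse to call the API with nothing to summarize.
	if strings.TrimSpace(changes) == "" {
		return "", fmt.Errorf("no changes provided")
	}

	// Environment-based configuration with sensible defaults.
	model := os.Getenv("GROQ_MODEL")
	if model == "" {
		model = defaultModel
	}
	endpoint := os.Getenv("GROQ_API_URL")
	if endpoint == "" {
		endpoint = baseURL
	}

	payload, err := json.Marshal(chatRequest{
		Model:    model,
		Messages: []chatMessage{{Role: "user", Content: buildPrompt(config, changes)}},
	})
	if err != nil {
		return "", fmt.Errorf("marshal request: %w", err)
	}

	req, err := http.NewRequest(http.MethodPost, endpoint, bytes.NewReader(payload))
	if err != nil {
		return "", fmt.Errorf("build request: %w", err)
	}
	req.Header.Set("Authorization", "Bearer "+apiKey)
	req.Header.Set("Content-Type", "application/json")

	resp, err := httpClient.Do(req)
	if err != nil {
		return "", fmt.Errorf("call Groq API: %w", err)
	}
	defer resp.Body.Close() // resource cleanup

	// Status code validation, including the response body in the error message.
	if resp.StatusCode != http.StatusOK {
		raw, _ := io.ReadAll(resp.Body)
		return "", fmt.Errorf("groq API returned %d: %s", resp.StatusCode, raw)
	}

	var parsed chatResponse
	if err := json.NewDecoder(resp.Body).Decode(&parsed); err != nil {
		return "", fmt.Errorf("decode response: %w", err)
	}
	// Empty response validation before indexing the first choice.
	if len(parsed.Choices) == 0 {
		return "", fmt.Errorf("groq API returned no choices")
	}
	return parsed.Choices[0].Message.Content, nil
}
```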

Owner

@DFanso DFanso left a comment


LGTM 🎉

@DFanso DFanso merged commit 3dc1479 into DFanso:main Oct 5, 2025
8 checks passed
@DFanso DFanso added labels: enhancement (New feature or request), hacktoberfest (Eligible for Hacktoberfest), hacktoberfest-accepted (Approved Hacktoberfest contribution), go (Pull requests that update go code) on Oct 5, 2025