
Conversation

Muneer320 (Contributor) commented Oct 8, 2025

Description

This PR introduces a reusable LLM provider interface with factory-based construction and refactors the commit message workflow to depend on it. It consolidates provider-specific logic, standardizes credential handling, and improves error messages so new providers can be added with minimal wiring.
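
For orientation, a rough sketch of the shape this abstraction takes is shown below. The identifiers (Provider, ProviderOptions, Factory, RegisterFactory, NewProvider, ErrMissingCredential) follow the PR; the struct fields, signatures, and registry details are illustrative assumptions, not the actual diff.

```go
// Package llm: illustrative sketch only, not the code from this PR.
package llm

import (
	"context"
	"errors"
	"fmt"
)

// ErrMissingCredential signals that a provider's API key (or equivalent)
// could not be resolved from configuration or environment variables.
var ErrMissingCredential = errors.New("missing credential")

// GenerationOptions is a stand-in for pkg/types.GenerationOptions.
type GenerationOptions struct{}

// Provider is the single interface the CLI depends on.
type Provider interface {
	Generate(ctx context.Context, changes string, opts GenerationOptions) (string, error)
}

// ProviderOptions carries whatever a concrete backend needs to build itself
// (field names are assumptions for this sketch).
type ProviderOptions struct {
	APIKey string
	Model  string
	URL    string // used by local backends such as Ollama
}

// Factory constructs a Provider from options; each backend registers one.
type Factory func(opts ProviderOptions) (Provider, error)

var factories = map[string]Factory{}

// RegisterFactory makes a backend selectable by name via NewProvider.
func RegisterFactory(name string, f Factory) {
	factories[name] = f
}

// NewProvider looks up the registered factory for name and builds a provider.
func NewProvider(name string, opts ProviderOptions) (Provider, error) {
	f, ok := factories[name]
	if !ok {
		return nil, fmt.Errorf("unsupported provider %q", name)
	}
	return f(opts)
}
```

With this shape, adding a backend means implementing Generate and registering a factory; the CLI never has to know which concrete type it is talking to.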

Type of Change

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Documentation update
  • Code refactoring
  • Performance improvement
  • Other (please describe):

Related Issue

Fixes #9

Changes Made

  • Introduced internal/llm with a provider interface, factory registry, and concrete adapters for each existing LLM backend.
  • Refactored CreateCommitMsg to instantiate providers through the new abstraction, reuse them across regenerations, and surface clearer credential guidance (a sketch of this call path follows after this list).
  • Added unit tests covering credential fallbacks, Ollama defaults, and factory overrides while preserving Groq test hooks.
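
To make the refactored call path concrete, here is a hedged sketch of how the CLI side could look under the sketch above. Only the llm.* identifiers come from the PR; the function name, parameters, module path, and error messages are placeholders.

```go
package cli

import (
	"context"
	"errors"
	"fmt"

	"example.com/commit-msg/internal/llm" // hypothetical module path
)

// generateWithProvider is an illustrative stand-in for the refactored
// CreateCommitMsg flow, not the actual implementation.
func generateWithProvider(ctx context.Context, providerName, apiKey, diff string) (string, error) {
	provider, err := llm.NewProvider(providerName, llm.ProviderOptions{APIKey: apiKey})
	if err != nil {
		if errors.Is(err, llm.ErrMissingCredential) {
			// "Clearer credential guidance": name the provider and say how to
			// fix the problem instead of surfacing a raw error.
			return "", fmt.Errorf("%s requires an API key; set it in the config or environment and retry", providerName)
		}
		return "", err
	}

	// The same provider instance can be reused for regenerations, so factory
	// construction and credential validation happen once per invocation.
	return provider.Generate(ctx, diff, llm.GenerationOptions{})
}
```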

Testing

  • Tested with Gemini API
  • Tested with Grok API
  • Tested on Windows
  • Tested on Linux
  • Tested on macOS
  • Added/updated tests (if applicable)

Checklist

  • My code follows the project's code style
  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings or errors
  • I have tested this in a real Git repository
  • I have read the CONTRIBUTING.md guidelines

Screenshots (if applicable)

N/A

Additional Notes

  • Providers can now be registered via llm.RegisterFactory, enabling easy experimentation with new backends without modifying the CLI.
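
For example, an experimental backend could be wired in roughly as follows. The echo provider, its package, and the module path are hypothetical; only RegisterFactory, Provider, ProviderOptions, and GenerationOptions are taken from the PR.

```go
package echo

import (
	"context"

	"example.com/commit-msg/internal/llm" // hypothetical module path
)

// echoProvider is a toy backend used only to demonstrate registration.
type echoProvider struct {
	model string
}

// Generate satisfies llm.Provider; a real backend would call its API here.
func (p *echoProvider) Generate(ctx context.Context, changes string, opts llm.GenerationOptions) (string, error) {
	return "chore: echo commit generated by " + p.model, nil
}

func init() {
	// Register under a new name so the CLI can select it without code changes.
	llm.RegisterFactory("echo", func(opts llm.ProviderOptions) (llm.Provider, error) {
		return &echoProvider{model: opts.Model}, nil
	})
}
```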

For Hacktoberfest Participants

  • This PR is submitted as part of Hacktoberfest 2025

Summary by CodeRabbit

  • New Features

    • Pluggable LLM provider system with support for OpenAI, Claude, Gemini, Grok, Groq, and Ollama.
    • Clear, provider-specific guidance when credentials are missing.
    • Automatic use of environment variables for credentials where available.
    • Sensible defaults and setup hints for Ollama (URL/model), including dry-run support.
  • Refactor

    • Unified message generation via a single provider interface for more consistent behavior across providers.

coderabbitai bot commented Oct 8, 2025

Walkthrough

Replaces per-provider commit message generation with a unified llm.Provider interface and factory. CLI now creates providers via llm.NewProvider and calls provider.Generate with context. Adds standardized credential handling with ErrMissingCredential and provider-specific hints. Introduces provider registry and concrete implementations plus unit tests for factory, env fallback, overrides, and defaults.
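
The standardized credential handling presumably follows a pattern along these lines inside each factory. The helper name, parameters, and environment-variable plumbing are assumptions for illustration, not code from the diff; it builds on the llm package sketch earlier in this thread.

```go
package llm

import (
	"fmt"
	"os"
)

// resolveCredential is a hypothetical helper: prefer the explicitly configured
// key, fall back to an environment variable, and wrap ErrMissingCredential so
// callers can match it with errors.Is and show a provider-specific hint.
func resolveCredential(providerName, configured, envVar string) (string, error) {
	if configured != "" {
		return configured, nil
	}
	if fromEnv := os.Getenv(envVar); fromEnv != "" {
		return fromEnv, nil
	}
	return "", fmt.Errorf("%s: %w (set %s or configure a key)", providerName, ErrMissingCredential, envVar)
}
```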

Changes

  • CLI refactor & error handling (cmd/cli/createMsg.go): Switches to context-aware, provider-based generation (generateMessage(ctx, provider,...)). Initializes providers via llm.NewProvider. Adds displayMissingCredentialHint and expands displayProviderError to handle llm.ErrMissingCredential and Ollama-specific guidance. Updates the regeneration and dry-run paths accordingly.
  • LLM provider framework (internal/llm/provider.go): Adds the Provider interface, ProviderOptions, Factory, a registry (RegisterFactory/NewProvider), and ErrMissingCredential. Implements factory-backed providers for OpenAI, Claude, Gemini, Grok, Groq, and Ollama with credential validation, env fallbacks, and a unified Generate API.
  • Provider tests (internal/llm/provider_test.go): Adds tests for missing credentials, env fallback, unsupported provider, Ollama defaults, and factory override using a fake provider; original factories are restored after the override tests. A hedged test sketch follows below.
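
A hedged sketch of what the factory-override test could look like in package llm. The registry key literal, the fake type, and the assertions are illustrative; the restore-original-factories step mirrors the summary above.

```go
package llm

import (
	"context"
	"testing"
)

// fakeProvider is a test double that never reaches a real backend.
type fakeProvider struct{}

func (fakeProvider) Generate(ctx context.Context, changes string, opts GenerationOptions) (string, error) {
	return "feat: fake commit message", nil
}

func TestFactoryOverride(t *testing.T) {
	const name = "openai" // assumed registry key for this sketch

	// Swap in a fake factory and restore the original when the test finishes.
	original := factories[name]
	defer func() { factories[name] = original }()

	RegisterFactory(name, func(opts ProviderOptions) (Provider, error) {
		return fakeProvider{}, nil
	})

	p, err := NewProvider(name, ProviderOptions{})
	if err != nil {
		t.Fatalf("NewProvider: %v", err)
	}
	msg, err := p.Generate(context.Background(), "some diff", GenerationOptions{})
	if err != nil || msg != "feat: fake commit message" {
		t.Fatalf("unexpected result %q, %v", msg, err)
	}
}
```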

Sequence Diagram(s)

sequenceDiagram
  autonumber
  actor User
  participant CLI as CLI (createMsg)
  participant LLM as llm.NewProvider
  participant Prov as Provider
  participant Backend as LLM Backend

  User->>CLI: request commit message
  CLI->>LLM: NewProvider(name, options)
  LLM-->>CLI: Provider instance or error
  alt Provider created
    CLI->>Prov: Generate(ctx, changes, opts)
    Prov->>Backend: GenerateCommitMessage(...)
    Backend-->>Prov: message or error
    Prov-->>CLI: message or error
    alt Success
      CLI-->>User: display message
    else Error
      CLI->>CLI: displayProviderError(...)
      CLI-->>User: error + hint
    end
  else Missing credential
    CLI->>CLI: displayMissingCredentialHint(name)
    CLI-->>User: credential guidance
  end

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

Possibly related PRs

Suggested labels

enhancement, hacktoberfest, hacktoberfest-accepted, go

Suggested reviewers

  • DFanso

Poem

I thump my paw—new tunnels bloom,
One door for all, no branching gloom.
Keys in pockets, hints in light,
Factories forge the path just right.
Providers hop in tidy rows—
Commit tales sprout where logic flows. 🥕🐇

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 18.18%, which is below the required threshold of 80.00%. Run @coderabbitai generate docstrings to improve docstring coverage.
✅ Passed checks (4 passed)
  • Description Check (✅ Passed): Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check (✅ Passed): The pull request title accurately and concisely describes the introduction of a unified LLM provider interface and the associated refactoring of the CLI integration, directly reflecting the main change in the changeset.
  • Linked Issues Check (✅ Passed): The changes fully implement the objectives of issue #9 by defining a common Provider interface, replacing repetitive provider-selection logic with a factory registry, refactoring the CLI to use the new abstraction, enhancing credential handling, and adding tests that enable mocking, thereby satisfying the coding-related requirements for extensibility and maintainability.
  • Out of Scope Changes Check (✅ Passed): All modifications pertain to the new LLM provider abstraction, factory registration, CLI refactoring, and related tests, with no unrelated or extraneous code changes introduced outside the scope of issue #9.

📜 Recent review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2c50506 and 87bd301.

📒 Files selected for processing (3)
  • cmd/cli/createMsg.go (8 hunks)
  • internal/llm/provider.go (1 hunks)
  • internal/llm/provider_test.go (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (3)
internal/llm/provider.go (2)
pkg/types/options.go (1)
  • GenerationOptions (4-10)
pkg/types/types.go (6)
  • ProviderOpenAI (8-8)
  • ProviderClaude (9-9)
  • ProviderGemini (10-10)
  • ProviderGrok (11-11)
  • ProviderGroq (12-12)
  • ProviderOllama (13-13)
internal/llm/provider_test.go (2)
pkg/types/types.go (6)
  • ProviderOpenAI (8-8)
  • ProviderClaude (9-9)
  • ProviderGemini (10-10)
  • ProviderGrok (11-11)
  • ProviderGroq (12-12)
  • ProviderOllama (13-13)
internal/llm/provider.go (5)
  • NewProvider (60-70)
  • ProviderOptions (32-35)
  • ErrMissingCredential (21-21)
  • RegisterFactory (53-57)
  • Provider (24-29)
cmd/cli/createMsg.go (3)
internal/llm/provider.go (4)
  • NewProvider (60-70)
  • ProviderOptions (32-35)
  • Provider (24-29)
  • ErrMissingCredential (21-21)
pkg/types/types.go (8)
  • Config (60-63)
  • ProviderOllama (13-13)
  • LLMProvider (5-5)
  • ProviderGemini (10-10)
  • ProviderOpenAI (8-8)
  • ProviderClaude (9-9)
  • ProviderGroq (12-12)
  • ProviderGrok (11-11)
pkg/types/options.go (1)
  • GenerationOptions (4-10)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Package Binaries
🔇 Additional comments (1)
internal/llm/provider.go (1)

68-69: Nice defensive defaulting.

Ensuring a non-nil config before invoking factories keeps the provider surface resilient.


Comment @coderabbitai help to get the list of available commands and usage tips.

DFanso (Owner) left a comment

LGTM 🎉

DFanso added the enhancement, hacktoberfest, hacktoberfest-accepted, and go labels on Oct 9, 2025.
DFanso merged commit 9ff358b into DFanso:main on Oct 9, 2025.
8 checks passed
coderabbitai bot mentioned this pull request on Oct 9, 2025.

Labels

  • enhancement: New feature or request
  • go: Pull requests that update go code
  • hacktoberfest: Eligible for Hacktoberfest
  • hacktoberfest-accepted: Approved Hacktoberfest contribution

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[Feature] Create LLM Interface for Better Code Structure and Extensibility

2 participants