Conversation

@GauravJangra9988 (Contributor) commented Oct 6, 2025

Description

The Ollama model can now be set up via the CLI instead of by setting an environment variable.

Type of Change

Changed the code to read the Ollama URL from config.json.

  • [x] Bug fix (non-breaking change which fixes an issue)
  • [ ] New feature (non-breaking change which adds functionality)
  • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • [ ] Documentation update
  • [x] Code refactoring
  • [x] Performance improvement
  • [ ] Other (please describe):

Related Issue

Fixes #66

Changes Made

  • Changed the logic to read Ollama settings from config.json, as the other models already do
  • Enhanced the options shown while setting up the Ollama model

Testing

  • [ ] Tested with Gemini API
  • [ ] Tested with Grok API
  • [x] Tested on Windows
  • [ ] Tested on Linux
  • [ ] Tested on macOS
  • [ ] Added/updated tests (if applicable)

Checklist

  • [x] My code follows the project's code style
  • [x] I have performed a self-review of my code
  • [x] I have commented my code, particularly in hard-to-understand areas
  • [ ] I have made corresponding changes to the documentation
  • [x] My changes generate no new warnings or errors
  • [x] I have tested this in a real Git repository
  • [ ] I have read the CONTRIBUTING.md guidelines

Screenshots (if applicable)


Additional Notes

Currently Ollama is the only local model, so it is hard-coded; once other models are added, model selection can be made dynamic.

For Hacktoberfest Participants

  • [x] This PR is submitted as part of Hacktoberfest 2025

Thank you for your contribution! 🎉

Summary by CodeRabbit

  • New Features

    • Streamlined LLM setup: when selecting Ollama, you’re prompted for a URL; other providers still use an API key. Success messages now reflect URL vs API key updates.
  • Refactor

    • Ollama configuration no longer relies on environment variables for endpoint/model and uses a fixed default model for more consistent behavior.
  • Chores

    • Removed a stray debug print from configuration handling.

@coderabbitai (coderabbitai bot) commented Oct 6, 2025

Walkthrough

Updates the CLI’s Ollama integration: changes the GenerateCommitMessage call signature to accept apiKey and a fixed model, removes env-based URL/model usage, and revises LLM setup to prompt for URL when Ollama is selected and API key otherwise. Also removes a debug print in the store.
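The URL-vs-API-key branching described in the walkthrough can be sketched as a small pure helper. This is an illustrative sketch only — the type and function names (promptConfig, promptFor) are hypothetical and do not appear in the actual cmd/cli/llmSetup.go, which drives promptui prompts directly:

```go
package main

import "fmt"

// promptConfig describes what the setup flow asks the user for a given
// LLM provider. Names here are illustrative, not from the real code.
type promptConfig struct {
	Label      string // prompt label shown to the user
	SuccessMsg string // message printed after the value is saved
	Masked     bool   // mask input for secrets (API keys), not for URLs
}

// promptFor returns the prompt configuration for the selected model:
// Ollama is configured with a URL, every other provider with an API key.
func promptFor(model string) promptConfig {
	if model == "Ollama" {
		return promptConfig{Label: "Enter Ollama URL", SuccessMsg: "URL Updated", Masked: false}
	}
	return promptConfig{Label: "Enter API Key", SuccessMsg: "API Key Updated", Masked: true}
}

func main() {
	fmt.Println(promptFor("Ollama").Label) // Enter Ollama URL
	fmt.Println(promptFor("Gemini").Label) // Enter API Key
}
```

Centralizing the decision like this keeps the two setup paths (SetupLLM and UpdateLLM) from diverging on labels or success messages.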

Changes

Commit message generation (Ollama call signature) — cmd/cli/createMsg.go
  Switches the ollama.GenerateCommitMessage call to (config, changes, apiKey, model); stops reading OLLAMA_URL/OLLAMA_MODEL; uses the fixed model "llama3:latest". Reflects the exported signature change in the ollama package.
LLM setup prompts and flow — cmd/cli/llmSetup.go
  Adds branching: Ollama → prompt for URL; others → prompt for API key. Introduces separate option sets, adjusts messages ("URL Updated" vs "API Key Updated"), centralizes prompt creation.
Store cleanup — cmd/cli/store/store.go
  Removes the fmt.Println(len(data)) debug output in DefaultLLMKey; no functional changes.
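The new call shape can be illustrated with a stub. The signature is paraphrased from the summary above; the function body is hypothetical and only shows which values now travel through the call instead of being read from the environment:

```go
package main

import "fmt"

// generateCommitMessage mirrors the shape of the new exported signature:
// the caller passes the stored credential (for Ollama, the URL from
// config.json) plus a model name, with no env lookups inside.
func generateCommitMessage(config, changes, apiKey, model string) string {
	// The real code would POST the changes to the Ollama endpoint;
	// this stub just echoes the values that flow through the call.
	return fmt.Sprintf("POST %s (model=%s)", apiKey, model)
}

func main() {
	fmt.Println(generateCommitMessage("{}", "diff", "http://localhost:11434", "llama3:latest"))
}
```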

Sequence Diagram(s)

sequenceDiagram
  autonumber
  actor User
  participant CLI as CLI (createMsg)
  participant Store as Store
  participant Ollama as Ollama API

  rect rgba(200,230,255,0.3)
  note over CLI: New call signature
  User->>CLI: generate commit message
  CLI->>Store: read config + apiKey
  CLI->>Ollama: GenerateCommitMessage(config, changes, apiKey, "llama3:latest")
  Ollama-->>CLI: message / error
  CLI-->>User: output
  end
sequenceDiagram
  autonumber
  actor User
  participant CLI as CLI (llmSetup)
  participant Store as Store

  alt Model == Ollama
    User->>CLI: select Ollama
    CLI->>User: prompt for URL
    User-->>CLI: enter URL
    CLI->>Store: save URL
    note right of CLI: Success: URL Updated
  else Other models
    User->>CLI: select other model
    CLI->>User: prompt for API Key
    User-->>CLI: enter API Key
    CLI->>Store: save API Key
    note right of CLI: Success: API Key Updated
  end

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Suggested labels

enhancement, go

Suggested reviewers

  • DFanso

Poem

A hare taps keys with whiskered grace,
Flips prompts for URLs in Ollama’s place.
No envs to chase, a model set—
Commit lines bloom, the cleanest yet.
hop hop saved a print—less noise, more pace! 🐇✨

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Docstring Coverage ⚠️ Warning — Docstring coverage is 0.00%, below the required threshold of 80.00%. You can run @coderabbitai generate docstrings to improve docstring coverage.
✅ Passed checks (2 passed)
  • Description Check ✅ Passed — Check skipped: CodeRabbit's high-level summary is enabled.
  • Title Check ✅ Passed — The title "Ollama-CLI integration" succinctly captures the primary change of moving Ollama configuration from environment variables into the CLI setup, directly reflecting the pull request's main objective without superfluous wording.

@coderabbitai (coderabbitai bot) left a comment

Actionable comments posted: 1

🧹 Nitpick comments (3)
cmd/cli/createMsg.go (2)

124-124: Consider making the Ollama model configurable.

The model is hard-coded to "llama3:latest". While the PR description acknowledges this as temporary, consider adding the model to the configuration to allow users to select different Ollama models without code changes.

If you'd like, I can propose a configuration schema change to support dynamic model selection for Ollama.
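Such a fallback could look like the following. This is a minimal sketch under stated assumptions: the config struct, its OllamaModel field, and the resolveOllamaModel helper are all hypothetical and not part of the current config.json schema:

```go
package main

import "fmt"

const defaultOllamaModel = "llama3:latest"

// config mirrors a hypothetical extension of config.json; the
// OllamaModel field does not exist in the current schema.
type config struct {
	OllamaModel string
}

// resolveOllamaModel prefers a user-configured model and falls back
// to the hard-coded default used by the call site today.
func resolveOllamaModel(c config) string {
	if c.OllamaModel != "" {
		return c.OllamaModel
	}
	return defaultOllamaModel
}

func main() {
	fmt.Println(resolveOllamaModel(config{}))                       // llama3:latest
	fmt.Println(resolveOllamaModel(config{OllamaModel: "mistral"})) // mistral
}
```

With a helper like this, the call site would pass resolveOllamaModel(cfg) instead of the literal "llama3:latest", so users could switch models without code changes.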


124-126: Clarify semantic overload: apiKey contains URL for Ollama.

The variable apiKey (from line 31) is used to pass the Ollama URL to GenerateCommitMessage. This creates confusion because:

  1. The variable name apiKey implies it contains an API key
  2. For Ollama, it actually contains the URL (set via the setup flow in llmSetup.go)
  3. The function parameter was renamed from url to apiKey, further obscuring intent

This semantic overload makes the code harder to understand and maintain. Consider one of these approaches:

Option 1: Rename the store field to be more generic (e.g., Credential or Config) and update variable names accordingly
Option 2: Add URL validation for Ollama to make the usage explicit

Example for Option 2 - Add validation before the switch statement:

// Validate Ollama URL format if using Ollama
if commitLLM == "Ollama" {
	if !strings.HasPrefix(apiKey, "http://") && !strings.HasPrefix(apiKey, "https://") {
		pterm.Error.Printf("Invalid Ollama URL format. Expected http:// or https:// prefix.\n")
		os.Exit(1)
	}
}

Don't forget to add the import:

import (
	"strings"
	// ... other imports
)
cmd/cli/llmSetup.go (1)

28-31: Consider extracting prompt creation to reduce duplication.

The apiKeyPrompt is defined in both SetupLLM and UpdateLLM with identical configuration. While this duplication is minor, extracting it to a helper function would improve maintainability.

Example helper function:

func createPrompt(label string, mask rune) promptui.Prompt {
    prompt := promptui.Prompt{
        Label: label,
    }
    if mask != 0 {
        prompt.Mask = mask
    }
    return prompt
}

Usage:

// In SetupLLM
apiKeyPrompt := createPrompt("Enter API Key", '*')

// In UpdateLLM
apiKeyPrompt := createPrompt("Enter API Key", '*')

Also applies to: 104-106

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 961cd50 and 8970a7f.

📒 Files selected for processing (3)
  • cmd/cli/createMsg.go (1 hunks)
  • cmd/cli/llmSetup.go (4 hunks)
  • cmd/cli/store/store.go (0 hunks)
💤 Files with no reviewable changes (1)
  • cmd/cli/store/store.go
🧰 Additional context used
🧬 Code graph analysis (1)
cmd/cli/llmSetup.go (1)
cmd/cli/store/store.go (1)
  • UpdateAPIKey (317-357)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Build Go Binary (windows-latest)
🔇 Additional comments (2)
cmd/cli/llmSetup.go (2)

99-119: LGTM! Clear option selection logic.

The code correctly adapts the prompt options and labels based on the selected model. The logic clearly distinguishes between Ollama (which needs URL configuration) and other models (which need API key configuration).


144-146: Good: Dynamic success message based on model type.

The success message correctly reflects whether an API key or URL was updated, improving user experience.

@DFanso (Owner) left a comment

LGTM 🎉

@DFanso added labels on Oct 6, 2025: bug (Something isn't working), enhancement (New feature or request), good first issue (Good for newcomers), hacktoberfest (Eligible for Hacktoberfest), hacktoberfest-accepted (Approved Hacktoberfest contribution), go (Pull requests that update go code)
@DFanso DFanso linked an issue Oct 6, 2025 that may be closed by this pull request
@DFanso DFanso merged commit 2a2bf89 into DFanso:main Oct 6, 2025
8 checks passed
@GauravJangra9988 GauravJangra9988 deleted the ollama-cli branch October 9, 2025 15:57


Development

Successfully merging this pull request may close these issues.

  • Use CLI to add URL for Ollama model instead of env
