xavimf87/commit-msg-ai
commit-msg-ai

Generate commit messages from your staged changes using a local LLM via Ollama. No API keys, no cloud — everything runs on your machine.

Getting started

1. Install and set up Ollama

commit-msg-ai requires Ollama to run language models locally. Install it first:

macOS:

brew install ollama

Linux:

curl -fsSL https://ollama.com/install.sh | sh

Windows: Download the installer from ollama.com/download.

Once installed, start the Ollama server:

ollama serve

On macOS, Ollama runs automatically in the background after installation. You can skip this step if you see the Ollama icon in your menu bar.
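To confirm the server is actually up, you can probe its HTTP API — Ollama listens on localhost:11434 by default and exposes a `/api/version` endpoint. A minimal check (a sketch; it assumes `curl` is available):

```shell
# Probe the default Ollama port; /api/version returns a small JSON
# payload when the server is running.
if curl -sf http://localhost:11434/api/version > /dev/null 2>&1; then
  status="running"
else
  status="not reachable"
fi
echo "Ollama server: $status"
```

If this prints "not reachable", start the server with `ollama serve` and try again.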

2. Choose a model

You need at least one model downloaded. See what's available on your machine:

ollama list

If the list is empty, pull a model. Some good options for commit message generation:

# Lightweight and fast (~2GB)
ollama pull llama3.2

# Good for code understanding (~4.7GB)
ollama pull qwen2.5-coder

# Small and capable (~2.3GB)
ollama pull mistral

You can browse all available models at ollama.com/library.

3. Install commit-msg-ai

pip install commit-msg-ai

4. Configure your model

By default, commit-msg-ai uses llama3.2. If you pulled a different model, set it as the default:

commit-msg-ai config model qwen2.5-coder

Verify your config:

commit-msg-ai config

5. Use it

$ git add .
$ commit-msg-ai
Staged files:
M  src/auth.py
A  src/middleware.py

Generating commit message with qwen2.5-coder...

──────────────────────────────────────────────────
feat: add JWT authentication middleware
──────────────────────────────────────────────────

Commit with this message? [Y/n] y
[main 3a1b2c3] feat: add JWT authentication middleware
 2 files changed, 45 insertions(+), 3 deletions(-)

That's it.

Configuration

commit-msg-ai stores config in ~/.config/commit-msg-ai/config.json.
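The file itself is plain JSON. Based on the `model` and `url` settings shown below, it might look like this — note the exact schema is an assumption here, not documented by the project:

```json
{
  "model": "qwen2.5-coder",
  "url": "http://localhost:11434"
}
```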

# Set default model
commit-msg-ai config model mistral

# Set Ollama server URL (useful for remote setups)
commit-msg-ai config url http://192.168.1.50:11434

# View all config
commit-msg-ai config

# View a single value
commit-msg-ai config model

Override any config for a single run with flags:

commit-msg-ai --model codellama
commit-msg-ai --url http://other-server:11434
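If you prefer running it as a git subcommand, a global git alias works. This is an optional convenience, not a feature of the tool, and the alias name `cm` is an arbitrary choice:

```shell
# Register a git alias ("cm" is a hypothetical name chosen here).
# The leading "!" tells git to run it as an external shell command.
git config --global alias.cm '!commit-msg-ai'
```

After this, `git cm` invokes commit-msg-ai.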

Commit message format

commit-msg-ai generates messages with only three prefixes:

  • feat: new features
  • fix: bug fixes
  • bc: breaking changes

Requirements

  • Python 3.9+
  • Ollama running locally (or on a reachable server)
  • At least one model pulled (ollama pull llama3.2)
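The requirements above can be sanity-checked from a shell. This is a sketch: it only verifies the interpreter version and whether an `ollama` binary is on PATH, not that the server is running:

```shell
# Check Python >= 3.9 and whether the ollama CLI is installed.
python3 -c 'import sys; raise SystemExit(0 if sys.version_info >= (3, 9) else 1)' \
  && py_ok="yes" || py_ok="no"
command -v ollama > /dev/null 2>&1 && ollama_ok="yes" || ollama_ok="no"
echo "python>=3.9: $py_ok  ollama on PATH: $ollama_ok"
```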
