AI-powered git workflow tool. Generates meaningful commit messages and pull request descriptions from your diffs — using any LLM you already have access to.
- Reads your staged `git diff` and generates commit message suggestions (3 by default, configurable)
- Generates PR titles and descriptions from your branch's commits (`gitai pr`)
- Interactive selection: pick a suggestion or write your own
- Supports multiple providers: Ollama (local), OpenAI, Anthropic, Gemini, and more
- Two commit styles: Conventional Commits or free-form
- Optional emoji (gitmoji) support
- Automatically truncates large diffs to fit model context limits
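The truncation step can be sketched as follows. This is a minimal illustration of the idea, not gitai's actual implementation; the `truncate_diff` name is hypothetical, while `max_diff_chars` mirrors the config key described below.

```python
def truncate_diff(diff: str, max_diff_chars: int = 12000) -> str:
    """Clip an oversized diff so the prompt stays within the model's context limit."""
    if len(diff) <= max_diff_chars:
        return diff
    # Keep the head of the diff and tell the model that the rest was cut.
    return diff[:max_diff_chars] + "\n... [diff truncated] ..."
```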
```shell
pip install gitai
```

Requires Python 3.11+.
```shell
# 1. Stage your changes
git add .

# 2. Run gitai
gitai commit
```

gitai reads the diff, calls your configured LLM, and presents 3 suggestions to choose from.
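Under the hood, the staged diff is simply what `git diff --cached` prints. A minimal sketch of reading it from Python (illustrative only; `staged_diff` is not part of gitai's API):

```python
import subprocess

def staged_diff() -> str:
    """Return the staged diff, i.e. the output of `git diff --cached`."""
    result = subprocess.run(
        ["git", "diff", "--cached"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout
```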
| Command | Description |
|---|---|
| `gitai commit` | Generate commit message suggestions for staged changes |
| `gitai commit --push` | Push to remote automatically after committing |
| `gitai commit -n 5` | Generate 5 suggestions instead of the default 3 |
| `gitai commit --suggestions 5` | Same as above |
| `gitai pr` | Push branch and generate a PR title + description |
| `gitai pr development` | Compare against a specific base branch |
| `gitai pr --full-diff` | Use a flat diff instead of a per-commit breakdown |
| `gitai pr --minimal` | Output title + bullet list only |
| `gitai pr --template TEMPLATE` | Fill in a custom PR template file |
| `gitai config` | View and update settings |
| `gitai --version` | Show version |
| `gitai --help` | Show help |
Run `gitai config` to update settings interactively. Settings are stored in `~/.gitai.toml`.
| Key | Default | Description |
|---|---|---|
| `provider` | `ollama` | LLM provider |
| `model` | `llama3.2` | Model name |
| `ollama_url` | `http://localhost:11434` | Ollama API base URL (Ollama only) |
| `commit_style` | `conventional` | `conventional` or `free-form` |
| `emoji` | `false` | Prefix suggestions with gitmoji |
| `num_suggestions` | `3` | Number of suggestions to generate |
| `max_diff_chars` | `12000` | Max diff size sent to the model (truncates if exceeded) |
| Provider | `provider` value | Example `model` value | API key env var |
|---|---|---|---|
| Ollama (local) | `ollama` | `llama3.2`, `mistral` | — |
| Anthropic | `anthropic` | `claude-sonnet-4-6`, `claude-haiku-4-5-20251001` | `ANTHROPIC_API_KEY` |
| OpenAI | `openai` | `gpt-4o`, `gpt-4o-mini` | `OPENAI_API_KEY` |
| Gemini | `gemini` | `gemini-2.0-flash` | `GEMINI_API_KEY` |
For cloud providers, set the API key in your shell profile:
bash/zsh (`~/.bashrc` or `~/.zshrc`):

```shell
export ANTHROPIC_API_KEY=sk-ant-...
```

PowerShell (`$PROFILE`):

```powershell
$env:ANTHROPIC_API_KEY="sk-ant-..."
```

An example `~/.gitai.toml` for a cloud provider:

```toml
provider = "anthropic"
model = "claude-haiku-4-5-20251001"
commit_style = "conventional"
emoji = false
ollama_url = "http://localhost:11434"
```

`gitai pr` pushes your current branch and generates a ready-to-copy PR title and description based on your commits.
```shell
# Auto-detect base branch (main/master/develop) and generate PR description
gitai pr

# Compare against a specific base branch
gitai pr development

# Use a flat diff instead of per-commit breakdown (good for large PRs)
gitai pr --full-diff

# Minimal output: title + bullet list only
gitai pr --minimal

# Fill in your team's PR template
gitai pr --template .github/PULL_REQUEST_TEMPLATE.md
```

If a `.github/PULL_REQUEST_TEMPLATE.md` exists in your repo, gitai will use it automatically.
If you want to run fully offline with Ollama:
- Install Ollama
- Pull a model: `ollama pull llama3.2`
- Run `gitai commit` (no API key needed)
- Allow configuring the number of suggestions generated
- Add `gitai pr` command for PR description generation
- Support unstaged changes with an optional `--all` flag
