
prgen

Generate a pull request title and description from git diff and commit history.

[screenshot: prgen CLI help]

Author: Jean Paul Fernandez · github.com/jpxoi/prgen

Licensed under the GNU General Public License v3.0 (GPL-3.0-only).

Requirements

  • Python 3.10+
  • git on your PATH
  • One of:
    • OpenAI API access
    • Google Gemini API access
    • A local or remote Ollama server

Install

From PyPI:

pip install prgen-cli
# or
uv tool install prgen-cli

From a clone:

uv sync

Run the CLI with:

prgen --help

What It Does

prgen compares HEAD against a base ref and collects:

  • git diff <base>...HEAD
  • git log <base>..HEAD

It sends that context to an LLM and prints:

  • a PR title from <summary>...</summary>
  • a PR description from <body>...</body>

If the model does not return those tags, prgen prints the raw model output instead.
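The tag handling described above can be sketched as follows. This is an illustrative regex-based extractor, not prgen's actual implementation; the fallback mirrors the documented behavior of printing the raw model output when the tags are missing:

```python
import re

def extract_pr_parts(output: str) -> tuple[str, str]:
    """Pull the PR title and description out of the model output.

    Returns (title, body) when both tags are present; otherwise
    returns an empty title and the raw output as the body.
    """
    summary = re.search(r"<summary>(.*?)</summary>", output, re.DOTALL)
    body = re.search(r"<body>(.*?)</body>", output, re.DOTALL)
    if summary and body:
        return summary.group(1).strip(), body.group(1).strip()
    return "", output  # no tags: fall back to the raw model output

title, desc = extract_pr_parts(
    "<summary>Fix login bug</summary><body>Details here.</body>"
)
print(title)  # Fix login bug
print(desc)   # Details here.
```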

Providers

prgen supports three backends:

  • openai
  • gemini
  • ollama

--provider auto is the default.

In auto mode:

  • Gemini is chosen when GOOGLE_API_KEY is available
  • otherwise OpenAI is chosen when OPENAI_API_KEY is available
  • if neither key is configured, prgen exits with an error

Ollama is always explicit:

  • use --provider ollama
  • also pass --model <name>
  • --tier presets do not apply to Ollama

Quick Start

OpenAI:

prgen config set OPENAI_API_KEY sk-...
prgen

Gemini:

prgen config set GOOGLE_API_KEY your-key
prgen --provider gemini

Ollama:

prgen --provider ollama --model llama3.1:8b

If the Ollama model is missing locally, let prgen pull it:

prgen --provider ollama --model llama3.1:8b --pull

Configuration

Configuration lives in ~/.config/prgen/config.json.

If XDG_CONFIG_HOME is set, prgen uses:

$XDG_CONFIG_HOME/prgen/config.json
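The path resolution above can be sketched like this (an illustrative helper, not prgen's actual code):

```python
from pathlib import Path

def config_path(environ: dict[str, str]) -> Path:
    """Resolve the config file location per the rules above (sketch)."""
    xdg = environ.get("XDG_CONFIG_HOME")
    base = Path(xdg) if xdg else Path.home() / ".config"
    return base / "prgen" / "config.json"
```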

You can manage the file with:

prgen config
prgen config show
prgen config path

Supported persisted keys:

  • OPENAI_API_KEY
  • GOOGLE_API_KEY
  • OLLAMA_HOST
  • base
  • provider
  • tier

Notes:

  • OPENAI_API_KEY and GOOGLE_API_KEY are treated as secrets
  • OLLAMA_HOST is not secret and is merged into the environment if set
  • base, provider, and tier are optional CLI defaults
  • model and context are not persisted in config

Examples:

prgen config
prgen config set OPENAI_API_KEY sk-...
prgen config set GOOGLE_API_KEY your-key
prgen config set OLLAMA_HOST http://127.0.0.1:11434
prgen config set base origin/main
prgen config set provider ollama
prgen config set tier pro
prgen config unset OLLAMA_HOST
prgen config show

To read a secret value from stdin:

prgen config set OPENAI_API_KEY - < key.txt

Defaults

Built-in defaults:

Option       Default            Notes
------       -------            -----
--repo, -C   current directory  Uses the current git repo unless you point elsewhere
--base       origin/main        The ref must resolve locally
--provider   auto               Prefers Gemini over OpenAI when both keys exist
--tier       default            Used only for OpenAI and Gemini
--model      unset              Overrides tier selection; required for Ollama
--context    none               Extra text merged into the prompt
--pull       false              Only relevant for Ollama
Config-file defaults apply only when you omit the matching flag:

  • base
  • provider
  • tier

Explicit flags always win over the config file.
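The precedence above can be sketched as a one-step lookup (an illustrative helper, not prgen's code):

```python
def effective_option(flag_value, config_value, default):
    """Precedence described above: explicit flag, then config file, then built-in."""
    if flag_value is not None:
        return flag_value
    if config_value is not None:
        return config_value
    return default

# Example: resolving --base
base = effective_option(None, "origin/develop", "origin/main")
print(base)  # origin/develop
```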

Current model presets:

  • OpenAI default: gpt-5-mini
  • OpenAI pro: gpt-5.4
  • Gemini default: gemini-3-flash-preview
  • Gemini pro: gemini-3.1-pro-preview

Usage

Basic usage:

prgen

Pick a different base:

prgen --base main

Run against another repository:

prgen -C ~/src/my-project

Override the model directly:

prgen --provider openai --model gpt-5.4
prgen --provider gemini --model gemini-3.1-pro-preview
prgen --provider ollama --model mistral-small3.1

Add extra context:

prgen --context "Focus on customer-facing impact and rollout notes."

Behavior Notes

  • prgen validates that --base resolves before generating anything
  • if there are no commits and no file changes vs the base ref, prgen exits with an error
  • when --provider ollama --pull is used, prgen can download the model automatically
  • when stderr is a TTY, loading states and Ollama downloads use Rich UI output

Development

Install local dependencies:

uv sync

Format and lint:

uv run ruff format .
uv run ruff check .

Run from the repo without installing globally:

uv run prgen --help

Install the local checkout as a global tool:

uv tool install .
# or
pipx install .
