Your intelligent gateway between Git and AI: half Bot, half Agent.
AI-Git-Bot is a lightweight, self-hostable gateway application that connects your Git platforms with AI providers. As a central hub, it receives webhooks from Gitea, GitHub, GitHub Enterprise, GitLab, and Bitbucket Cloud, routes them to configurable AI providers, and writes the results back as code reviews, comments, or even entire pull requests, fully automated.
| Audience | Benefit |
|---|---|
| Developers who want a personalized code AI | Configure your own AI with custom system prompts, for code reviews that match your tech stack and coding standards. |
| Teams with multiple projects & Git systems | Define an AI configuration once and reuse it across any number of repositories, projects, and Git platforms, all through a single gateway. |
| Multi-pass reviews with different personas | Create multiple bots with different prompts: a security reviewer, a performance expert, a junior mentor, all on the same PR. |
| Self-hosters with compliance requirements | Run everything on-premise with local LLMs (Ollama, llama.cpp). No code leaves your infrastructure; ideal for regulatory and compliance needs. |
| Lightweight AI implementation | A single Docker image, one PostgreSQL database, done. No complex infrastructure, no Kubernetes clusters required. |
AI-Git-Bot unites two worlds:
- As a Bot, it automatically reacts to pull requests, answers questions in comments, and delivers context-aware inline reviews, like a reliable code-review partner that never sleeps.
- As an Agent, it autonomously takes on entire issues: it analyzes the task, reads the source code, generates an implementation, validates the code with build tools, and creates a finished pull request, all on its own.
More than a bot. More than an agent. The intelligent gateway for your entire code review and implementation workflow.
AI-Git-Bot acts as a central gateway between your Git systems and AI providers:
```mermaid
graph LR
    subgraph GitPlatforms["Git Platforms"]
        Gitea
        GitHub
        GitLab
        Bitbucket
    end
    subgraph AIProviders["AI Providers"]
        Anthropic
        OpenAI
        Ollama
        LlamaCpp["llama.cpp"]
    end
    Gateway["AI-Git-Bot<br/>(Gateway)"]
    DB["PostgreSQL<br/>(Config & Sessions)"]
    Gitea <--> Gateway
    GitHub <--> Gateway
    GitLab <--> Gateway
    Bitbucket <--> Gateway
    Gateway <--> Anthropic
    Gateway <--> OpenAI
    Gateway <--> Ollama
    Gateway <--> LlamaCpp
    Gateway --> DB
```
Benefits of the gateway approach:
- One configuration, many repositories: set up once, use everywhere
- Mix & match: combine different AI providers with different Git platforms
- Centralized control: manage API keys, tokens, and prompts in one place
- Unified monitoring: dashboard with statistics across all bots
- Encrypted secrets: API keys and tokens are stored with AES-256-GCM encryption
When a pull request is opened or updated, the bot automatically reviews the diff and posts feedback as a review comment. Large diffs are intelligently split into chunks with automatic retry on token limits.
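The chunking step can be pictured with a small sketch. This is illustrative only, with hypothetical names, not the project's actual code: it splits a unified diff at per-file boundaries so no file is torn apart, then greedily packs whole files into chunks under a size budget (standing in for the token limit).

```python
def split_diff(diff: str, max_chars: int = 12_000) -> list[str]:
    """Split a unified diff into chunks, cutting only at per-file
    boundaries ("diff --git" headers)."""
    files, current = [], []
    for line in diff.splitlines(keepends=True):
        if line.startswith("diff --git") and current:
            files.append("".join(current))
            current = []
        current.append(line)
    if current:
        files.append("".join(current))

    # Greedily pack whole files into chunks under the size budget.
    chunks, buf = [], ""
    for f in files:
        if buf and len(buf) + len(f) > max_chars:
            chunks.append(buf)
            buf = ""
        buf += f
    if buf:
        chunks.append(buf)
    return chunks
```

On a token-limit error, the bot can retry each chunk independently; a character budget is only a rough proxy for real token counting.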
Mention the bot (e.g. @ai_bot) in any PR comment to ask questions or request additional analysis. The bot acknowledges the mention with an emoji reaction and responds using the full conversation history.
Mention the bot in an inline review comment on a specific code line. The bot includes the file context and diff hunk when generating its answer and replies directly inline.
Assign the bot to an issue: it analyzes the task, reads the source code, generates an implementation, validates with build tools, and creates a finished pull request. Fully autonomous.
See the Agent Documentation for details.
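The five stages of the agent workflow can be sketched as a simple pipeline. All names here are hypothetical stubs, not the project's actual code; each stage is reduced to a placeholder so only the control flow is visible.

```python
from dataclasses import dataclass

@dataclass
class AgentResult:
    branch: str
    validated: bool

def run_agent(issue_title: str, source_files: dict[str, str]) -> AgentResult:
    """Stub of the agent loop: analyze -> read -> generate -> validate -> PR."""
    plan = f"Plan for: {issue_title}"                     # 1. analyze the task
    context = sum(len(c) for c in source_files.values())  # 2. read the source code
    patch = f"# patch from {plan} ({context} chars)"      # 3. generate an implementation
    validated = len(patch) > 0                            # 4. validate (really: run e.g. `mvn verify`)
    branch = "ai-bot/" + issue_title.lower().replace(" ", "-")
    return AgentResult(branch=branch, validated=validated)  # 5. open a PR from this branch
```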
All configuration is managed through a web-based UI β no environment variables needed for AI providers, Git connections, or bot settings:
- Create multiple AI Integrations (Anthropic, OpenAI, Ollama, llama.cpp)
- Create multiple Git Integrations (Gitea, GitHub, GitHub Enterprise, GitLab, Bitbucket Cloud)
- Create multiple Bots, each with its own webhook URL, AI provider, and system prompt
- Dashboard with statistics and monitoring
| Provider | Default API URL | Suggested Models |
|---|---|---|
| Anthropic | https://api.anthropic.com | claude-opus-4-6, claude-sonnet-4-6, claude-haiku-4-5-20251001 |
| OpenAI | https://api.openai.com | gpt-5.4, gpt-5.3-codex, gpt-5.1-codex-max, gpt-5-codex |
| Ollama | http://localhost:11434 | User-configured local models |
| llama.cpp | http://localhost:8081 | User-configured GGUF models |
| Provider | Description |
|---|---|
| Gitea | Self-hosted Gitea instances |
| GitHub | github.com |
| GitHub Enterprise | Self-hosted GitHub Enterprise Server |
| GitLab | gitlab.com and self-managed GitLab CE/EE |
| Bitbucket Cloud | bitbucket.org |
- Session Management: maintains conversation history per PR, persisted in the database, enabling context-aware follow-up reviews
- Configurable System Prompts: select from built-in prompt templates or define custom prompts per bot
- AI-Driven Code Validation: the agent validates generated code with build tools (Maven, Gradle, npm, Go, Cargo, etc.)
- Health Endpoint: `/actuator/health` for monitoring and orchestration
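A healthy instance answers `GET /actuator/health` with the standard Spring Boot Actuator body:

```json
{ "status": "UP" }
```

A Docker healthcheck or load balancer probe can poll this endpoint to decide when the gateway is ready to receive webhooks.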
The bot is available as a Docker image on Docker Hub.
```yaml
services:
  app:
    image: tmseidel/ai-git-bot:latest
    ports:
      - "8080:8080"
    environment:
      SPRING_PROFILES_ACTIVE: docker
      DATABASE_URL: jdbc:postgresql://db:5432/giteabot
      DATABASE_USERNAME: ${DATABASE_USERNAME:-giteabot}
      DATABASE_PASSWORD: ${DATABASE_PASSWORD:-giteabot}
      APP_ENCRYPTION_KEY: ${APP_ENCRYPTION_KEY:-change-me}
    depends_on:
      db:
        condition: service_healthy
    restart: unless-stopped
  db:
    image: postgres:17-alpine
    environment:
      POSTGRES_DB: giteabot
      POSTGRES_USER: ${DATABASE_USERNAME:-giteabot}
      POSTGRES_PASSWORD: ${DATABASE_PASSWORD:-giteabot}
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${DATABASE_USERNAME:-giteabot}"]
      interval: 5s
      timeout: 5s
      retries: 5
    restart: unless-stopped
volumes:
  pgdata:
```

Start the stack:

```shell
docker compose up --build -d
```

This starts:
- The bot application on port 8080
- A PostgreSQL 17 database for persistence
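`APP_ENCRYPTION_KEY` should be replaced with a strong random value in production, since it protects the AES-256-GCM-encrypted secrets in the database. One way to generate such a key (assuming the application accepts any sufficiently long random string):

```python
import secrets

# 32 random bytes, URL-safe base64-encoded: enough entropy for an AES-256 key
key = secrets.token_urlsafe(32)
print(key)
```

Export the result as `APP_ENCRYPTION_KEY` before running `docker compose up`.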
- Navigate to `http://localhost:8080`
- Create your administrator account
- Log in to access the management dashboard
- Create an AI Integration:
  - Go to AI Integrations → New Integration
  - Select a provider (e.g. "anthropic")
  - The API URL is auto-filled with the provider's default
  - Select a model from the dropdown or enter a custom model name
  - Enter your API key
- Create a Git Integration:
  - Go to Git Integrations → New Integration
  - Select your provider (Gitea, GitHub, GitLab, or Bitbucket)
  - Enter your Git server URL and API token
  - See Gitea Setup, GitHub Setup, GitLab Setup, or Bitbucket Setup
- Create a Bot:
  - Go to Bots → New Bot
  - Select your AI and Git integrations
  - Optionally select a system prompt template
  - Copy the generated Webhook URL
Configure webhooks in your Git provider to notify the bot about PR events.
- Gitea: See Gitea Setup
- GitHub: See GitHub Setup
- GitLab: See GitLab Setup
- Bitbucket Cloud: See Bitbucket Setup
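For orientation, a `pull_request` webhook delivers a JSON payload roughly along these lines (heavily abbreviated and with illustrative values; the exact field set varies by provider and version), which the gateway uses to locate the PR and fetch its diff:

```json
{
  "action": "opened",
  "number": 42,
  "pull_request": {
    "title": "Add login validation",
    "head": { "ref": "feature/login" },
    "base": { "ref": "main" }
  },
  "repository": { "full_name": "acme/webshop" }
}
```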
See the User Guide for detailed instructions.
```mermaid
graph LR
    Git["Git Platform<br/>(Gitea / GitHub / GitLab / Bitbucket)"]
    Bot["AI-Git-Bot<br/>(Gateway)"]
    AI["AI Provider<br/>(Anthropic / OpenAI / Ollama / llama.cpp)"]
    DB["PostgreSQL"]
    Git -- "Webhooks" --> Bot
    Bot -- "Fetch diff, post reviews" --> Git
    Bot -- "AI review requests" --> AI
    Bot -- "Configuration & Sessions" --> DB
```
The bot receives webhooks from your Git provider, fetches PR diffs, sends them to the configured AI provider for review, and posts the results back. All configuration (AI integrations, Git integrations, bots) and conversation sessions are persisted in the database.
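Conceptually, this routing is a lookup from a bot's webhook URL to its configured pair of integrations. A simplified sketch with hypothetical names (in the real application this mapping lives in PostgreSQL):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Bot:
    webhook_id: str        # last segment of the generated webhook URL
    git_integration: str   # e.g. "gitea-internal"
    ai_integration: str    # e.g. "anthropic-default"

# Two bots reviewing the same repositories with different providers.
BOTS = {
    b.webhook_id: b
    for b in [
        Bot("sec-review", "gitea-internal", "anthropic-default"),
        Bot("perf-review", "gitea-internal", "ollama-local"),
    ]
}

def route(webhook_id: str) -> tuple[str, str]:
    """Resolve an incoming webhook to (Git integration, AI integration)."""
    bot = BOTS.get(webhook_id)
    if bot is None:
        raise KeyError(f"unknown webhook id: {webhook_id}")
    return bot.git_integration, bot.ai_integration
```

Because the bot, not the repository, owns the configuration, several bots (personas) can point at the same repository while using different AI providers.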
See the Architecture Documentation for detailed component diagrams and request flows.
| Document | Description |
|---|---|
| User Guide | Web UI usage, creating bots and integrations |
| Architecture | Component diagrams, request flows, webhook routing |
| Agent | Autonomous issue implementation agent: setup and usage |
| Git Provider Setup | |
| Gitea Setup | Bot user creation, permissions, API tokens for Gitea |
| GitHub Setup | Bot user creation, permissions, PAT tokens for GitHub |
| GitLab Setup | Bot user creation, permissions, PAT tokens for GitLab |
| Bitbucket Setup | API tokens and webhook configuration for Bitbucket Cloud |
| AI Provider Setup | |
| Using Ollama | Running with local LLMs via Ollama |
| Using llama.cpp | Running with llama.cpp and GBNF grammar support |
| Deployment | |
| Deployment | Docker Compose deployment, environment variables |
| Local Development | Building, testing, project structure |
| Community | |
| Contributing | Contribution guidelines, coding conventions |
| Code of Conduct | Community standards |