AI-powered Docker configuration generator that creates production-ready Dockerfiles, docker-compose.yml, and .dockerignore files for any repository.
https://dockerizer.dev by Dublyo
- Automatic Stack Detection - Detects language, framework, and version with 90%+ confidence
- Production-Ready Output - Multi-stage builds, non-root users, health checks, optimized layers
- AI Fallback - Uses OpenAI, Anthropic, or Ollama when detection confidence is low
- Interactive Setup - Guided CLI wizard for AI configuration and customization
- Build Plan - Nixpacks-inspired plan command for debugging and transparency
- Procfile Support - Respects Heroku-style Procfiles for start commands
- 26 Providers - Framework support across Node.js, Python, Go, Rust, Ruby, PHP, Java, .NET, and Elixir
- Agent Mode - Iterative analyze → generate → build → test → fix workflow
- MCP Server - Integration with Claude Code and Goose AI assistants
- Recipe System - YAML-based automation workflows
```bash
# Install script
curl -fsSL https://dockerizer.dev/install.sh | sh

# Homebrew
brew install dublyo/tap/dockerizer

# Go install
go install github.com/dublyo/dockerizer/cmd/dockerizer@latest

# Build from source
git clone https://github.com/dublyo/dockerizer
cd dockerizer/src
make build
sudo make install

# Docker
docker run --rm -v $(pwd):/app ghcr.io/dublyo/dockerizer /app
```

```bash
# Interactive setup (recommended for first-time users)
dockerizer init

# Auto-detect and generate Docker configs
dockerizer .

# Preview build plan without generating files
dockerizer plan ./my-project

# Force AI generation for better results
ANTHROPIC_API_KEY=sk-ant-xxx dockerizer --ai ./my-project
```

| Language | Frameworks | Confidence |
|---|---|---|
| Node.js | Next.js, NestJS, Nuxt, Remix, Astro, SvelteKit, Hono, Koa, Fastify, Express | 80-100% |
| Python | Django, FastAPI, Flask | 90-100% |
| Go | Gin, Fiber, Echo, Standard | 90% |
| Rust | Actix Web, Axum | 90% |
| Ruby | Rails | 85-90% |
| PHP | Laravel, Symfony | 85-95% |
| Java | Spring Boot, Quarkus | 90-95% |
| .NET | ASP.NET Core | 70-90% |
| Elixir | Phoenix | 80-90% |
Guided wizard that walks you through AI configuration and file generation.
```bash
dockerizer init ./my-project
```

Features:
- Auto-detects your stack and asks you to confirm it
- Prompts for AI provider (Anthropic, OpenAI, Ollama)
- Securely accepts API keys
- Previews and generates files
- Saves configuration for future use
Output the resolved build plan as JSON or YAML without generating files. Inspired by Nixpacks.
```bash
dockerizer plan ./my-project
dockerizer plan --format yaml ./my-project
dockerizer plan --output plan.json ./my-project
```

The plan includes:
- Detection results (language, framework, version, confidence)
- Build phases with commands
- Cache directories for faster builds
- Start command resolution
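With `--format yaml`, the resolved plan looks roughly like the following. This is an illustrative sketch that reuses the fields of the JSON plan shown later in this README; the exact phases and commands depend on the detected stack.

```yaml
detection:
  detected: true
  language: nodejs
  framework: nextjs
  version: "14.0.0"
  confidence: 95
phases:
  - name: setup
    commands: ["npm ci"]
  - name: build
    depends_on: ["setup"]
    commands: ["npm run build"]
cache_dirs:
  - path: /root/.npm
    id: npm-cache
start:
  cmd: node server.js
```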
Generate Docker configuration files.
```bash
dockerizer ./my-project
```

Flags:

| Flag | Description |
|---|---|
| `--ai` | Force AI generation even for high-confidence detections |
| `-f, --force` | Overwrite existing files |
| `-o, --output` | Output directory (default: same as input) |
| `--no-compose` | Skip docker-compose.yml generation |
| `--no-ignore` | Skip .dockerignore generation |
| `--no-env` | Skip .env.example generation |
| `--json` | Output results as JSON |
| `-v, --verbose` | Enable verbose output |
| `-q, --quiet` | Suppress non-essential output |
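For example, to regenerate just the Dockerfile into a separate directory (a hypothetical `./deploy` folder) and overwrite whatever is already there, combining the flags documented above:

```bash
dockerizer --force --output ./deploy --no-compose --no-ignore --no-env ./my-project
```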
Detect stack without generating files.
```bash
dockerizer detect ./my-project
dockerizer detect --all ./my-project   # Show all candidates
```

Run in agent mode with an iterative build/test/fix cycle.

```bash
OPENAI_API_KEY=sk-xxx dockerizer agent ./my-project
```

Start the MCP server for AI assistant integration (stdio mode).

```bash
dockerizer serve
```

Configure in Claude Code (`~/.claude.json`):

```json
{
"mcpServers": {
"dockerizer": {
"command": "dockerizer",
"args": ["serve"]
}
}
}
```

Execute a YAML workflow recipe.

```bash
dockerizer recipe analyze --path ./my-project
dockerizer recipe generate --path ./my-project
dockerizer recipe build-and-test --path ./my-project
```

Validate Dockerfile syntax and best practices.

```bash
dockerizer validate ./Dockerfile
```

Customize build behavior via environment variables (Nixpacks-inspired):
| Variable | Description |
|---|---|
| `DOCKERIZER_BUILD_CMD` | Override build command |
| `DOCKERIZER_INSTALL_CMD` | Override install/setup command |
| `DOCKERIZER_START_CMD` | Override start command |
| `DOCKERIZER_APT_PKGS` | Additional APT packages (comma-separated) |

Example:

```bash
DOCKERIZER_START_CMD="gunicorn app:app" dockerizer ./my-project
```

Running `dockerizer ./my-project` generates:
| File | Description |
|---|---|
| `Dockerfile` | Multi-stage, optimized, production-ready |
| `docker-compose.yml` | Service definition with health checks, resource limits |
| `.dockerignore` | Language-specific exclusions |
| `.env.example` | Environment variables template |
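For a Node.js service, the generated `docker-compose.yml` might look something like the sketch below. The service name, port, health-check endpoint, and resource limits are illustrative assumptions; the actual output depends on the detected stack.

```yaml
services:
  app:
    build: .
    ports:
      - "3000:3000"
    environment:
      NODE_ENV: production
    healthcheck:
      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:3000/api/health"]
      interval: 30s
      timeout: 10s
      retries: 3
    deploy:
      resources:
        limits:
          cpus: "1.0"
          memory: 512M
```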
Configure AI providers via environment variables:
```bash
# Anthropic
export ANTHROPIC_API_KEY=sk-ant-xxx
export ANTHROPIC_MODEL=claude-3-5-haiku-20241022   # optional

# OpenAI
export OPENAI_API_KEY=sk-xxx
export OPENAI_MODEL=gpt-4o-mini                    # optional

# Ollama
export OLLAMA_BASE_URL=http://localhost:11434      # optional
export OLLAMA_MODEL=llama3                         # optional
```

AI is automatically used when:
- Detection confidence is below 80%
- The `--ai` flag is specified
- No matching template exists for the detected stack
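For example, to push generation through a local Ollama model regardless of detection confidence (assuming the Ollama variables are enough to select the provider; you can also set `provider: ollama` in the config file shown below):

```bash
export OLLAMA_BASE_URL=http://localhost:11434
export OLLAMA_MODEL=llama3
dockerizer --ai ./my-project
```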
Create `.dockerizer.yml` in your project or `~/.dockerizer.yml` globally:

```yaml
ai:
  provider: anthropic
  model: claude-3-5-haiku-20241022
defaults:
  include_compose: true
  include_ignore: true
  include_env: true
  overwrite: false
```

Example plan output:

```json
{
"version": "1.0",
"generator": "dockerizer v1.0.0",
"detection": {
"detected": true,
"language": "nodejs",
"framework": "nextjs",
"version": "14.0.0",
"confidence": 95,
"provider": "nextjs"
},
"phases": [
{
"name": "setup",
"commands": ["npm ci"]
},
{
"name": "build",
"depends_on": ["setup"],
"commands": ["npm run build"]
}
],
"cache_dirs": [
{"path": "/root/.npm", "id": "npm-cache"}
],
"start": {
"cmd": "node server.js"
}
}
```

Example generated Dockerfile (Next.js):

```dockerfile
# Build stage
FROM node:20-alpine AS builder
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci
COPY . .
ENV NEXT_TELEMETRY_DISABLED=1
RUN npm run build
# Production stage
FROM node:20-alpine AS runner
WORKDIR /app
ENV NODE_ENV=production
RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs
COPY --from=builder /app/.next/standalone ./
COPY --from=builder /app/.next/static ./.next/static
COPY --from=builder /app/public ./public
USER nextjs
EXPOSE 3000
CMD ["node", "server.js"]
HEALTHCHECK --interval=30s --timeout=10s --start-period=40s --retries=3 \
  CMD wget --no-verbose --tries=1 --spider http://localhost:3000/api/health || exit 1
```

| Feature | Dockerizer | Nixpacks |
|---|---|---|
| Output | Human-editable Dockerfile | OCI image only |
| AI Fallback | Yes (OpenAI, Anthropic, Ollama) | No |
| Interactive Setup | Yes (`dockerizer init`) | No |
| Build Plan | Yes (`dockerizer plan`) | Yes |
| Agent Mode | Yes (iterative fix) | No |
| MCP Integration | Yes | No |
| Procfile Support | Yes | Yes |
| Cache Directories | Yes | Yes |
| Env Overrides | Yes (`DOCKERIZER_*`) | Yes (`NIXPACKS_*`) |
```bash
make build       # Build binary to ./build/dockerizer
make build-all   # Build for all platforms
make install     # Install to /usr/local/bin
make clean       # Remove build artifacts
```

To add a new provider (see the sketch below):
- Create a provider file: `providers/<language>/<framework>.go`
- Implement the `providers.Provider` interface
- Register it in `providers/<language>/register.go`
- Add a template in `internal/generator/generator.go`
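The `providers.Provider` interface is not shown in this README, so the Go sketch below only illustrates the general shape of a new provider. The package path, the method names (`Name`, `Detect`), and the `Detection` struct are assumptions for illustration; check the existing providers under `providers/` for the real interface.

```go
// Hypothetical sketch of a provider for the Bottle framework (not in the
// supported-stack table above). The interface shape shown here is assumed.
package python

import (
	"os"
	"path/filepath"
	"strings"
)

// Detection mirrors the fields reported by `dockerizer plan`:
// language, framework, version, and confidence. Assumed type.
type Detection struct {
	Language   string
	Framework  string
	Version    string
	Confidence int
}

// BottleProvider claims a project when requirements.txt mentions bottle
// and an app.py entry point exists.
type BottleProvider struct{}

func (p BottleProvider) Name() string { return "bottle" }

func (p BottleProvider) Detect(root string) (Detection, bool) {
	req, err := os.ReadFile(filepath.Join(root, "requirements.txt"))
	if err != nil {
		return Detection{}, false
	}
	if !strings.Contains(strings.ToLower(string(req)), "bottle") {
		return Detection{}, false
	}
	if _, err := os.Stat(filepath.Join(root, "app.py")); err != nil {
		return Detection{}, false
	}
	return Detection{Language: "python", Framework: "bottle", Confidence: 85}, true
}
```

Registration and the Dockerfile template would then follow the pattern in `providers/<language>/register.go` and `internal/generator/generator.go`, as the list above describes.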
- More Node.js frameworks (Nuxt, NestJS, Remix, Astro, SvelteKit, Hono, Koa)
- Ruby/Rails support
- PHP/Laravel + Symfony support
- Java/Spring Boot support
- .NET/ASP.NET Core support
- Elixir/Phoenix support
- Kubernetes manifests generation
- GitHub Actions integration
- VS Code extension
- npx-style execution
MIT License - see LICENSE for details.
Contributions are welcome! Please open an issue or submit a pull request.