Dagu solves the problem of complex workflow orchestration without requiring a dedicated infrastructure team. Unlike traditional workflow engines that demand databases, message queues, and constant operational care, Dagu runs as a single binary with zero external dependencies.
After managing hundreds of cron jobs across multiple servers, I built Dagu to bring sanity to workflow automation. It handles scheduling, dependencies, error recovery, and monitoring: everything you need for production workflows, without the complexity.
- Single binary - No databases, no message brokers. Deploy anywhere in seconds.
- Language agnostic - Execute Python, Bash, Node.js, or any command. Your existing scripts just work.
- Local first - Define and execute workflows in a single, self-contained environment; no internet required. Whether you're prototyping on your laptop, running on IoT devices, or deploying to air-gapped on-premises servers, Dagu just works.
- Hierarchical DAGs - Compose workflows from smaller workflows.
v1.17.0 - June 17, 2025
Major performance improvements, hierarchical DAG execution, enhanced UI, and partial success status. Full changelog →
- DAG definition - Express complex dependencies in readable YAML
- Scheduling - Cron expressions with timezone support
- Queueing - Control concurrency with named queues
- Error handling - Retries, failure handlers, cleanup hooks
- Conditional execution - Run steps based on conditions
- Parallel execution - Control concurrent step execution
- Variables & Parameters - Pass data between steps, parameterize workflows
- Docker support - Run steps in containers
- SSH executor - Execute commands on remote hosts
- HTTP requests - Integrate with APIs
- Email notifications - SMTP integration
- Hierarchical workflows - Nest DAGs to any depth
- Authentication - Basic auth, API tokens, TLS
- Web UI - Real-time monitoring and control
- REST API - Full programmatic access
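Several of these features combine naturally in a single DAG. The sketch below is illustrative only: it reuses field names (`schedule`, `output`, `retryPolicy`, `preconditions`) that appear in the examples later in this README, and the script names are hypothetical.

```yaml
name: nightly-report
schedule: "0 1 * * *"          # cron scheduling
steps:
  - name: fetch
    command: python fetch.py   # hypothetical script
    output: REPORT_DATA        # pass data to later steps
  - name: build-report
    command: python report.py ${REPORT_DATA}
    retryPolicy:
      limit: 2                 # retry up to twice on failure
  - name: publish
    command: ./publish.sh ${REPORT_DATA}
    preconditions:             # conditional execution
      - condition: "`date +%u`"
        expected: "re:[1-5]"   # weekdays only
```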
```bash
# Install
curl -L https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.sh | bash

# Create the Dagu configuration directory
mkdir -p ~/.config/dagu/dags

# Create your first workflow
cat > ~/.config/dagu/dags/hello.yaml << 'EOF'
steps:
  - name: hello
    command: echo "Hello from Dagu!"
  - name: world
    command: echo "Running step 2"
EOF

# Execute it
dagu start hello

# Check the status
dagu status hello

# Start the web UI
dagu start-all
# Visit http://localhost:8080
```
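While iterating, two more commands are useful; this is a sketch, so verify the exact syntax with `dagu --help` or the CLI Reference below:

```bash
# Dry-run the workflow without executing its commands
dagu dry hello

# Stop a running workflow
dagu stop hello
```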
```bash
# Latest
curl -L https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.sh | bash

# Specific version
curl -L https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.sh | bash -s -- --version v1.17.0

# Install to a specific directory
curl -L https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.sh | bash -s -- --prefix /path/to/install

# Homebrew
brew install dagu-org/brew/dagu
```
Or run with Docker:

```bash
docker run -d \
  --name dagu \
  -p 8080:8080 \
  -v ~/.dagu:/dagu \
  ghcr.io/dagu-org/dagu:latest dagu start-all
```
Or download a binary from releases and add it to your PATH.
- Getting Started - Tutorial and first steps
- Core Concepts - Architecture and design
- Writing Workflows - Complete authoring guide
- CLI Reference - Command-line usage
- API Reference - REST API documentation
- Configuration - Configuration options
Find more in our examples documentation.
A scheduled ETL pipeline that passes data between steps and retries the flaky part:

```yaml
name: daily-etl
schedule: "0 2 * * *"
steps:
  - name: extract
    command: python extract.py
    output: DATA_FILE
  - name: validate
    command: python validate.py ${DATA_FILE}
  - name: transform
    command: python transform.py ${DATA_FILE}
    retryPolicy:
      limit: 3
  - name: load
    command: python load.py ${DATA_FILE}
```
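Assuming this is saved as daily-etl.yaml in your DAGs directory (the filename is an assumption; use whatever you like), the scheduler runs it at 02:00 daily, and you can drive it manually with the same commands as in the quick start:

```bash
dagu start daily-etl    # run immediately
dagu status daily-etl   # inspect the latest run
```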
Hierarchical workflows: a parent DAG calls two child DAGs defined in the same file, separated by `---`:

```yaml
steps:
  - name: data-pipeline
    run: etl
    params: "ENV=prod REGION=us-west-2"
  - name: parallel-jobs
    run: batch
    parallel:
      items: ["job1", "job2", "job3"]
      maxConcurrency: 2
    params: "JOB=${ITEM}"

---
name: etl
params:
  - ENV
  - REGION
steps:
  - name: process
    command: python etl.py --env ${ENV} --region ${REGION}

---
name: batch
params:
  - JOB
steps:
  - name: process
    command: python process.py --job ${JOB}
```
Running steps in containers, with a conditional deploy step:

```yaml
name: ml-pipeline
steps:
  - name: prepare-data
    executor:
      type: docker
      config:
        image: python:3.11
        autoRemove: true
        volumes:
          - /data:/data
    command: python prepare.py
  - name: train-model
    executor:
      type: docker
      config:
        image: tensorflow/tensorflow:latest-gpu
    command: python train.py
  - name: deploy
    command: kubectl apply -f model-deployment.yaml
    preconditions:
      - condition: "`date +%u`"
        expected: "re:[1-5]" # Weekdays only
```
- Data Engineering - ETL pipelines, data validation, warehouse loading
- Machine Learning - Training pipelines, model deployment, experiment tracking
- DevOps - CI/CD workflows, infrastructure automation, deployment orchestration
- Media Processing - Video transcoding, image manipulation, content pipelines
- Business Automation - Report generation, data synchronization, scheduled tasks
Prerequisites: Go 1.24+, Node.js, pnpm
```bash
git clone https://github.com/dagu-org/dagu.git && cd dagu
make build
make run
```
Contributions are welcome. See our documentation for development setup.
- @jerry-yuan - Docker optimization
- @vnghia - Container enhancements ([#898])
- @thefishhat - Repeat policies & partial success ([#1011])
- @kriyanshii - Queue functionality
- @ghansham - Code reviews
GNU GPLv3 - See LICENSE
If you find Dagu useful, please ⭐ star this repository