Dagu

A portable, local-first, and language-agnostic workflow engine that runs anywhere


Documentation • Quick Start • Installation • Community

What is Dagu?

Dagu solves the problem of complex workflow orchestration without requiring a dedicated infrastructure team. Unlike traditional workflow engines that demand databases, message queues, and constant operational care, Dagu runs as a single binary with zero external dependencies.

After managing hundreds of cron jobs across multiple servers, I built Dagu to bring sanity to workflow automation. It handles scheduling, dependencies, error recovery, and monitoring: everything you need for production workflows, without the complexity.

→ Learn the core concepts
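
As a sketch of how those pieces fit together, the workflow below schedules a nightly run, chains two steps with an explicit depends, and retries a flaky step. The script names are placeholders, and retryPolicy uses only the limit field that also appears in the ETL example further down:

schedule: "0 3 * * *"    # run nightly at 03:00
steps:
  - name: fetch
    command: python fetch.py     # placeholder script
  - name: report
    command: python report.py    # placeholder script
    depends:
      - fetch                    # run only after fetch succeeds
    retryPolicy:
      limit: 2                   # retry up to 2 times on failure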

Design Philosophy

  1. Single binary - No databases, no message brokers. Deploy anywhere in seconds.
  2. Language agnostic - Execute Python, Bash, Node.js, or any command. Your existing scripts just work (see the sketch after this list).
  3. Local first - Define and execute workflows in a single, self-contained environment, with no internet required. Whether you're prototyping on your laptop, running on IoT devices, or deploying to air-gapped on-premises servers, Dagu just works.
  4. Hierarchical DAGs - Compose workflows from smaller workflows.
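
To make point 2 concrete, here is a minimal polyglot sketch: each step is just a command line, so any interpreter or binary on your PATH works the same way:

steps:
  - name: python-step
    command: python3 -c "print('hello from Python')"
  - name: bash-step
    command: bash -c 'echo hello from Bash'
  - name: node-step
    command: node -e "console.log('hello from Node.js')"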

Latest Release

v1.17.0 - June 17, 2025

Major performance improvements, hierarchical DAG execution, enhanced UI, and partial success status. Full changelog →


Quick Start

# Install
curl -L https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.sh | bash

# Create dagu configuration directory
mkdir -p ~/.config/dagu/dags

# Create your first workflow
cat > ~/.config/dagu/dags/hello.yaml << 'EOF'
steps:
  - name: hello
    command: echo "Hello from Dagu!"
    
  - name: world  
    command: echo "Running step 2"
EOF

# Execute it
dagu start hello

# Check the status
dagu status hello

# Start the web UI
dagu start-all
# Visit http://localhost:8080

Installation

macOS / Linux

# Latest
curl -L https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.sh | bash

# Specific version
curl -L https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.sh | bash -s -- --version v1.17.0

# Install to a specific directory
curl -L https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.sh | bash -s -- --prefix /path/to/install

# Homebrew
brew install dagu-org/brew/dagu

Docker

docker run -d \
  --name dagu \
  -p 8080:8080 \
  -v ~/.dagu:/dagu \
  ghcr.io/dagu-org/dagu:latest dagu start-all
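
If you prefer Docker Compose, the same container can be described declaratively. This is a minimal sketch mirroring the docker run flags above (image, port, volume, and command), not an official compose file:

services:
  dagu:
    image: ghcr.io/dagu-org/dagu:latest
    command: dagu start-all
    ports:
      - "8080:8080"            # web UI
    volumes:
      - ~/.dagu:/dagu          # same host directory as the docker run example
    restart: unless-stopped    # optional: bring the scheduler back up on reboot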

Manual Download

Download from releases and add to PATH.


Examples

Find more in our examples documentation.

ETL Pipeline

name: daily-etl
schedule: "0 2 * * *"
steps:
  - name: extract
    command: python extract.py
    output: DATA_FILE
    
  - name: validate
    command: python validate.py ${DATA_FILE}
    
  - name: transform
    command: python transform.py ${DATA_FILE}
    retryPolicy:
      limit: 3
      
  - name: load
    command: python load.py ${DATA_FILE}

Hierarchical Workflows

steps:
  - name: data-pipeline
    run: etl
    params: "ENV=prod REGION=us-west-2"
    
  - name: parallel-jobs
    run: batch
    parallel:
      items: ["job1", "job2", "job3"]
      maxConcurrency: 2
    params: "JOB=${ITEM}"
---
name: etl
params:
  - ENV
  - REGION
steps:
  - name: process
    command: python etl.py --env ${ENV} --region ${REGION}
---
name: batch
params:
  - JOB
steps:
  - name: process
    command: python process.py --job ${JOB}

Container-based Pipeline

name: ml-pipeline
steps:
  - name: prepare-data
    executor:
      type: docker
      config:
        image: python:3.11
        autoRemove: true
        volumes:
          - /data:/data
    command: python prepare.py
    
  - name: train-model
    executor:
      type: docker
      config:
        image: tensorflow/tensorflow:latest-gpu
    command: python train.py
    
  - name: deploy
    command: kubectl apply -f model-deployment.yaml
    preconditions:
      - condition: "`date +%u`"
        expected: "re:[1-5]"  # Weekdays only

Web Interface

Learn more about the Web UI →

Dashboard

Real-time monitoring of all workflows

DAG Editor

Visual workflow editor with validation

Log Viewer

Detailed execution logs with stdout/stderr separation

Use Cases

  • Data Engineering - ETL pipelines, data validation, warehouse loading
  • Machine Learning - Training pipelines, model deployment, experiment tracking
  • DevOps - CI/CD workflows, infrastructure automation, deployment orchestration
  • Media Processing - Video transcoding, image manipulation, content pipelines
  • Business Automation - Report generation, data synchronization, scheduled tasks

Building from Source

Prerequisites: Go 1.24+, Node.js, pnpm

git clone https://github.com/dagu-org/dagu.git && cd dagu
make build
make run

Contributing

Contributions are welcome. See our documentation for development setup.

Contributors

v1.17.0 Contributors

License

GNU GPLv3 - See LICENSE


If you find Dagu useful, please ⭐ star this repository