A visual DAG builder for Apache Airflow. Drag, drop, and connect operators on a canvas, then generate valid, idiomatic Python DAG files in both Traditional and Taskflow API syntax.
- Visual drag-and-drop canvas – ReactFlow-powered node graph with auto-validation for a seamless editing experience.
- Dual syntax generation – Traditional (`with DAG(...)`) and Taskflow API (`@dag` + `@task`). Toggle modes instantly.
- 11 built-in operators – Python, Bash, S3, BigQuery, Postgres, HTTP, SQL sensors, and more natively supported.
- Extensible registry – Add new operators with just YAML + Jinja2 parameters. Zero Python backend changes required!
- Task Groups – Create and manage Airflow Task Groups directly from the canvas UI. Drag task nodes onto a Task Group to assign them as children.
- Keyboard shortcuts – Accelerate DAG building with standard shortcuts (`Ctrl+Z`/`Cmd+Z` for undo, `Delete` for node removal, `Esc` to deselect, etc.).
- Dark / Light theme – Elegant, user-customizable UI themes toggled with one click and persisted via localStorage.
- Live validation – Instant cycle detection, required parameter validation, and `dag_id` checks as you build.
- Save & load workspaces – Persist DAG graphs to local storage or a backend service and reload them later for continued editing.
- Code preview – See generated code instantly as you place nodes and visually construct your workflow, with one-click copy to clipboard.
```bash
# Backend
cd backend
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
uvicorn main:app --host 0.0.0.0 --port 8000 --reload
```

```bash
# Frontend (separate terminal)
cd frontend
npm install
npm run dev
```

| Service | URL |
|---|---|
| DAG Builder UI | http://localhost:3000 |
| API + Swagger | http://localhost:8000/docs |
```bash
docker compose up -d
```

| Service | URL |
|---|---|
| DAG Builder UI | http://localhost:3000 |
| DAG Builder API | http://localhost:8000 |
| Airflow Webserver | http://localhost:8080 |

Default Airflow credentials: `admin` / `admin`
| Category | Operator | Provider |
|---|---|---|
| Python/Bash | PythonOperator | core |
| Python/Bash | BashOperator | core |
| Python/Bash | BranchPythonOperator | core |
| Sensors | S3KeySensor | apache-airflow-providers-amazon |
| Sensors | HttpSensor | apache-airflow-providers-http |
| Sensors | SqlSensor | apache-airflow-providers-common-sql |
| Transfers | PostgresOperator | apache-airflow-providers-postgres |
| Transfers | BigQueryInsertJobOperator | apache-airflow-providers-google |
| Transfers | S3ToGCSOperator | apache-airflow-providers-google |
| Flow Control | TriggerDagRunOperator | core |
| Flow Control | ShortCircuitOperator | core |
No Python code changes required – just YAML + Jinja2.
```yaml
- id: snowflake_operator
  label: Snowflake SQL
  category: transfers
  import_path: airflow.providers.snowflake.operators.snowflake
  class_name: SnowflakeOperator
  provider_package: apache-airflow-providers-snowflake
  template: snowflake_operator.j2
  required_params:
    - name: task_id
      type: string
      label: Task ID
    - name: sql
      type: text
      label: SQL statement
  optional_params:
    - name: snowflake_conn_id
      type: connection
      label: Snowflake connection
      default: snowflake_default
    - name: warehouse
      type: string
      label: Warehouse
      default: ""
```

```jinja
{{ task_id }} = SnowflakeOperator(
    task_id="{{ task_id }}",
    sql={{ sql | to_python_string }},
    snowflake_conn_id="{{ snowflake_conn_id | default('snowflake_default') }}",
    {% if warehouse %}warehouse="{{ warehouse }}",{% endif %}
)
```

The operator will automatically appear in the left palette. No frontend changes needed.
Supported param types: `string`, `int`, `bool`, `text` (multi-line), `dict`, `list`, `connection`

Available Jinja2 filters: `python_bool`, `to_python_string`, `format_start_date`, `tojson`
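To illustrate what these filters do at render time, here is a minimal sketch of two of them. The function bodies are assumptions for illustration, not the project's actual implementations; in practice they would be registered on the Jinja2 environment (e.g. `env.filters["to_python_string"] = to_python_string`).

```python
# Illustrative sketches of two registry filters (not the project's real code).

def python_bool(value):
    """Render a truthy config value as a Python boolean literal."""
    return "True" if value in (True, "true", "True", 1) else "False"

def to_python_string(value):
    """Render text as a Python string literal; multi-line text becomes a
    triple-quoted string so the generated DAG file stays readable."""
    text = str(value)
    if "\n" in text:
        return '"""' + text + '"""'
    return repr(text)

as_bool = python_bool("true")             # "True"
one_line = to_python_string("SELECT 1")   # "'SELECT 1'"
multi = to_python_string("SELECT *\nFROM t")
```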
```python
with DAG(dag_id="my_dag", ...) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_fn)
    transform = PythonOperator(task_id="transform", python_callable=transform_fn)
    extract >> transform
```

```python
@dag(dag_id="my_dag", ...)
def my_dag():
    @task()
    def extract(**context):
        pass

    @task()
    def transform(extract_result, **context):
        pass

    extract_result = extract()
    transform_result = transform(extract_result)

my_dag()
```

Toggle between modes using the Traditional / Taskflow pill in the toolbar. Sensors and transfers remain as traditional operators inside the `@dag` function (valid Airflow mixed-mode).
| Method | Path | Description |
|---|---|---|
| POST | `/api/v1/dags/generate` | Generate Python code from graph JSON |
| POST | `/api/v1/dags/validate` | Validate graph (cycles, required params) |
| POST | `/api/v1/dags/save` | Persist graph to storage |
| GET | `/api/v1/dags/` | List all saved DAGs |
| GET | `/api/v1/dags/{id}` | Load a saved DAG graph |
| DELETE | `/api/v1/dags/{id}` | Delete a saved DAG |
| GET | `/api/v1/operators/` | Operator registry (for palette) |
```bash
cd backend
source venv/bin/activate
python -m unittest discover tests -v
# 36 tests, all passing
```

```
Browser (React + ReactFlow + Zustand)
  – drag-drop canvas, config panel, code preview
        │
        ▼ REST API
FastAPI Backend
  ├── /generate  → DAGGenerator → Jinja2 templates → .py source
  ├── /validate  → cycle detection, required params, dag_id check
  ├── /save      → JSON file store (swappable for PostgreSQL)
  └── /operators → registry.yaml → palette data
        │
        ▼ direct write / git commit
Apache Airflow
  ├── Scheduler (picks up .py files from dags/ folder)
  └── Webserver (run, monitor, backfill)
```
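The cycle detection behind `/validate` can be done with a topological sort over the graph's edges. A self-contained sketch using Kahn's algorithm (the function name and edge format are assumptions, not the project's actual validator):

```python
from collections import defaultdict, deque

def has_cycle(nodes, edges):
    """Return True if the task graph contains a cycle.
    `edges` is a list of (upstream, downstream) task-id pairs."""
    indegree = {n: 0 for n in nodes}
    downstream = defaultdict(list)
    for src, dst in edges:
        downstream[src].append(dst)
        indegree[dst] += 1
    # Kahn's algorithm: repeatedly remove nodes with no remaining upstreams.
    queue = deque(n for n, d in indegree.items() if d == 0)
    visited = 0
    while queue:
        node = queue.popleft()
        visited += 1
        for nxt in downstream[node]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                queue.append(nxt)
    # Any node never removed sits on a cycle.
    return visited != len(nodes)

acyclic = has_cycle(["extract", "transform"], [("extract", "transform")])  # False
cyclic = has_cycle(["a", "b"], [("a", "b"), ("b", "a")])                   # True
```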
- Dark mode – Warm amber/stone palette (default)
- Light mode – Clean stone/white palette
- Font – Outfit geometric sans-serif + JetBrains Mono for code
- CSS Variables – All colors tokenized in `globals.css`, switchable via the `data-theme` attribute
- Change `AIRFLOW__WEBSERVER__SECRET_KEY` in docker-compose.yml
- Set a strong Postgres password
- Configure CORS `allow_origins` in `backend/main.py`
- Mount a persistent volume for `backend/data/`
- Set up Airflow connections (AWS, GCP, etc.) via the Airflow UI
- Add authentication to the API (JWT middleware recommended)
All notable changes to Airflow Studio are documented in this section.
- Keyboard Shortcuts: Canvas now supports keyboard shortcuts for faster DAG building.
  - `Ctrl+Z`/`Cmd+Z` – Undo last action
  - `Ctrl+Y`/`Cmd+Shift+Z` – Redo
  - `Delete`/`Backspace` – Delete selected node or edge
  - `Ctrl+A`/`Cmd+A` – Select all nodes
  - `Escape` – Deselect / close right panel
- Task Groups: Users can now create and manage Airflow Task Groups directly from the canvas UI.
  - New `task_group` node type available in the operator panel
  - Drag any task node onto a Task Group to assign it as a child; React Flow handles parent-child rendering automatically
  - Task Group node renders with a dashed border and translucent background for clear visual distinction
  - Right panel support for `group_id` and `tooltip` properties
  - Code generator (`generator.py`) updated to emit valid `with TaskGroup(...) as <id>:` Python blocks with correct indentation
  - `from airflow.utils.task_group import TaskGroup` injected automatically when Task Groups are present in the DAG
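To make the emission step concrete, here is a minimal sketch of how a generator could produce an indented `with TaskGroup(...) as <id>:` block. The helper name and signature are illustrative assumptions, not the actual code in `generator.py`.

```python
def render_task_group(group_id, task_lines, tooltip=""):
    """Emit a `with TaskGroup(...) as <id>:` block with its child task
    statements indented one level. Illustrative helper only."""
    args = [f'group_id="{group_id}"']
    if tooltip:
        args.append(f'tooltip="{tooltip}"')
    header = f"with TaskGroup({', '.join(args)}) as {group_id}:"
    body = [" " * 4 + line for line in task_lines]  # indent children 4 spaces
    return "\n".join([header] + body)

block = render_task_group("etl", ['extract = PythonOperator(task_id="extract")'])
```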
- New:
  - Core canvas with drag-and-drop DAG building
  - Operator registry with standard Airflow operators
  - Python code generation via `generator.py`
  - Right panel for task configuration
MIT