A complete Apache Airflow deployment with Docker Compose and integrated MCP (Model Context Protocol) Server for AI-powered workflow management through Warp terminal.
Quick start:

```bash
# Navigate to the project directory
cd airflow-deploy-mcp

# Start all services
./start.sh

# Access the Airflow UI at http://localhost:8080
# Username: admin
# Password: admin123
```

The deployment runs the following services:

- PostgreSQL: Metadata database
- Redis: Message broker for Celery
- Airflow Webserver: Web UI (port 8080)
- Airflow Scheduler: DAG scheduling and orchestration
- Airflow Worker: Celery executor for task execution
- Airflow Triggerer: Support for deferrable operators
Key features:

- 🤖 AI Integration: Control Airflow through AI assistants in Warp terminal
- 📊 DAG Management: List, trigger, and monitor DAGs
- 📈 Real-time Monitoring: Check DAG run status and task logs
- 🔧 Configuration: Manage Airflow variables and connections
- 🔐 Security: Authenticated access via Airflow REST API
The MCP Server exposes the following tools (a sketch of how they map onto the Airflow REST API follows the list):

DAG management:

- `list_dags` - List all available DAGs
- `get_dag` - Get detailed DAG information
- `trigger_dag` - Trigger DAG runs with configuration
- `pause_dag` - Pause a DAG to prevent new runs
- `unpause_dag` - Unpause a DAG to allow scheduling
- `get_dag_runs` - View DAG run history
- `get_dag_run_status` - Check specific run status

DAG file management:

- `list_dag_files` - List all DAG files in the dags folder
- `read_dag_file` - Read the content of a DAG file
- `upload_dag_file` - Upload a new DAG or update an existing one
- `delete_dag_file` - Delete DAG files
- `validate_dag_file` - Validate Python syntax and DAG structure

Task monitoring:

- `get_task_instances` - List tasks in a DAG run
- `get_task_logs` - Retrieve task execution logs

Variables and connections:

- `list_variables` - List Airflow variables
- `get_variable` - Get a variable value
- `set_variable` - Set or update variables
- `delete_variable` - Delete variables
- `list_connections` - List Airflow connections
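These tools are thin wrappers around Airflow's stable REST API, using the authenticated access noted in the features above. As a rough illustration of what a call such as `trigger_dag` does behind the scenes, with the default credentials from this setup (the actual implementation lives in `mcp-server/server.py` and may differ):

```python
import requests

AIRFLOW_BASE_URL = "http://localhost:8080"
AUTH = ("admin", "admin123")  # defaults from .env; change them for production

def trigger_dag(dag_id: str, conf: dict | None = None) -> dict:
    """Trigger a DAG run through Airflow's stable REST API."""
    response = requests.post(
        f"{AIRFLOW_BASE_URL}/api/v1/dags/{dag_id}/dagRuns",
        auth=AUTH,
        json={"conf": conf or {}},
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    run = trigger_dag("example_hello_world", {"message": "Hello from the REST API"})
    print(run["dag_run_id"], run["state"])
```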
Prerequisites:

- Docker Desktop: Version 20.10 or higher
- Docker Compose: Version 2.0 or higher
- Memory: Minimum 4GB RAM recommended
- Disk Space: At least 5GB free space
- Warp Terminal: For MCP integration (optional)
Navigate to the project directory:

```bash
cd airflow-deploy-mcp
```

Review and customize the `.env` file:

```bash
cat .env
```

Important: Change the default passwords for production environments!
Using the provided script:
```bash
./start.sh
```

Or manually with Docker Compose:

```bash
docker-compose up -d
```

First-time startup takes 2-3 minutes to:
- Download Docker images
- Initialize Airflow database
- Create admin user
- Start all services
Check that the services are up:

```bash
docker-compose ps
```

All services should show as "healthy" or "running".
- Open a browser to: http://localhost:8080
- Log in with:
  - Username: `admin`
  - Password: `admin123`
The project includes two example DAGs:
`example_hello_world`:

- Purpose: Simple DAG for testing basic functionality
- Tasks: bash_hello → python_hello → print_context → goodbye
- Schedule: Manual trigger only
- Features: Demonstrates basic bash and Python operators (a minimal sketch follows the test steps below)
How to test:
- Navigate to the DAGs page in the Airflow UI
- Find `example_hello_world`
- Click the Play button (▶️) to trigger it
- View logs in the Graph view
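For orientation, here is a minimal sketch of what a DAG shaped like `example_hello_world` could look like; it assumes Airflow 2.4+ imports and is not necessarily identical to the file shipped in `dags/`:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def say_hello():
    print("Hello from Python!")

def show_context(**context):
    # Airflow injects the task context (logical date, DAG run, etc.) as keyword arguments
    print(f"Logical date: {context['ds']}")

with DAG(
    dag_id="example_hello_world",
    start_date=datetime(2024, 1, 1),
    schedule=None,   # manual trigger only
    catchup=False,
) as dag:
    bash_hello = BashOperator(task_id="bash_hello", bash_command="echo 'Hello from Bash!'")
    python_hello = PythonOperator(task_id="python_hello", python_callable=say_hello)
    print_context = PythonOperator(task_id="print_context", python_callable=show_context)
    goodbye = BashOperator(task_id="goodbye", bash_command="echo 'Goodbye!'")

    bash_hello >> python_hello >> print_context >> goodbye
```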
`example_etl_pipeline`:

- Purpose: Complete ETL workflow demonstration
- Tasks: start → extract → transform → validate → load → notify → end
- Schedule: Daily (or manual trigger)
- Features: Shows XCom usage for inter-task data passing (see the XCom sketch after the test steps below)
How to test:
- Go to DAGs page
- Find `example_etl_pipeline`
- Toggle the On/Off switch to "On" (for auto-scheduling)
- Click Play button for manual trigger
- Monitor execution in Graph or Grid view
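The XCom mechanism mentioned above lets one task's return value be pulled by a downstream task. A minimal sketch of the extract → transform hand-off (illustrative only; the actual `example_etl_pipeline` tasks may differ):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # The return value is automatically pushed to XCom under the key "return_value"
    return [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]

def transform(ti):
    # Pull the upstream task's return value from XCom and derive a new payload
    rows = ti.xcom_pull(task_ids="extract")
    total = sum(row["amount"] for row in rows)
    return {"row_count": len(rows), "total_amount": total}

with DAG(
    dag_id="xcom_handoff_sketch",  # illustrative DAG id, not part of this project
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task
```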
To view logs:

```bash
# All services
docker-compose logs -f

# Specific service
docker-compose logs -f airflow-webserver
docker-compose logs -f mcp-server
docker-compose logs -f airflow-scheduler
```

To stop the stack, use the provided script:

```bash
./stop.sh
```

Or manually:

```bash
# Keep data
docker-compose down

# Remove all data and volumes
docker-compose down -v
```

To connect the MCP Server to Warp:

- View the MCP configuration:

  ```bash
  cat mcp-config.json
  ```
- Add to Warp:
  - Open Warp Settings
  - Navigate to "Features" → "MCP Servers"
  - Add the configuration from `mcp-config.json`
- Restart the Warp terminal
Add this configuration to Warp MCP settings:
```json
{
  "mcpServers": {
    "airflow": {
      "command": "docker",
      "args": [
        "exec",
        "-i",
        "mcp-server",
        "python",
        "/app/server.py"
      ],
      "env": {
        "AIRFLOW_BASE_URL": "http://localhost:8080",
        "AIRFLOW_API_USERNAME": "admin",
        "AIRFLOW_API_PASSWORD": "admin123"
      }
    }
  }
}
```

After configuration, you can use AI commands in Warp:
```text
# List all DAGs
"Show me all Airflow DAGs"

# Trigger a DAG
"Trigger the example_hello_world DAG"

# Check DAG status
"What's the status of the latest run of example_etl_pipeline?"

# Get task logs
"Show me logs for the transform task in example_etl_pipeline"

# Manage variables
"Set Airflow variable API_KEY to test123"
"What's the value of API_KEY variable?"
```
Scenario 1: Trigger DAG with Configuration

AI: "Trigger example_hello_world with config message='Hello from AI'"

Scenario 2: Monitor DAG Run

AI: "Show me the last 5 runs of example_etl_pipeline and their status"

Scenario 3: Debug Failed Task

AI: "The ETL pipeline failed, show me logs from the transform task"

Scenario 4: Manage Configuration

AI: "List all Airflow variables and their values"

Scenario 5: Upload a New DAG

AI: "Create a new DAG file called my_custom_dag.py with a simple hello world task"

Scenario 6: List and Read DAG Files

AI: "Show me all DAG files and then read the content of example_hello_world.py"

Scenario 7: Validate DAG Before Upload

AI: "Validate this DAG code before I upload it: [paste code]"

Scenario 8: Pause/Unpause DAG

AI: "Pause the example_etl_pipeline DAG temporarily"
Project structure:

```text
airflow-deploy-mcp/
├── docker-compose.yaml     # Docker Compose configuration
├── .env                    # Environment variables (gitignored)
├── .env.example            # Environment template
├── .gitignore              # Git ignore patterns
├── mcp-config.json         # MCP Server configuration for Warp
├── README.md               # This file
├── LICENSE                 # MIT License
├── start.sh                # Startup script
├── stop.sh                 # Shutdown script
│
├── dags/                   # Airflow DAGs directory
│   ├── example_hello_world.py
│   └── example_etl_pipeline.py
│
├── logs/                   # Airflow logs (auto-generated)
├── plugins/                # Airflow plugins (optional)
├── config/                 # Airflow configuration (optional)
│
└── mcp-server/             # MCP Server implementation
    ├── Dockerfile          # MCP Server Docker image
    ├── requirements.txt    # Python dependencies
    └── server.py           # MCP Server implementation
```
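To orient readers in `mcp-server/server.py`: an MCP server of this kind typically registers each tool as a Python function and forwards it to the Airflow REST API. The sketch below shows that pattern using the FastMCP helper from the MCP Python SDK; it is an illustration of the general shape, not the project's actual implementation.

```python
import os

import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("airflow")

# Connection details come from the environment, as wired up in mcp-config.json above
AIRFLOW_BASE_URL = os.environ.get("AIRFLOW_BASE_URL", "http://localhost:8080")
AUTH = (
    os.environ.get("AIRFLOW_API_USERNAME", "admin"),
    os.environ.get("AIRFLOW_API_PASSWORD", "admin123"),
)

@mcp.tool()
def list_dags() -> list[dict]:
    """List all available DAGs via the Airflow REST API."""
    response = requests.get(f"{AIRFLOW_BASE_URL}/api/v1/dags", auth=AUTH)
    response.raise_for_status()
    return response.json()["dags"]

if __name__ == "__main__":
    # stdio transport matches the `docker exec -i ... python /app/server.py` invocation above
    mcp.run()
```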
Check Docker:

```bash
docker info
```

Check memory allocation:

- Docker Desktop → Settings → Resources
- Recommended: Memory >= 4GB

Check initialization logs:

```bash
docker-compose logs airflow-init
docker-compose logs airflow-webserver
```

Wait for initialization to complete:

```bash
docker-compose logs -f airflow-webserver
```

Look for: "Running the Gunicorn Server".
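Once the webserver is up, its standard `/health` endpoint reports the status of the metadatabase and scheduler; a small check, assuming the default port mapping from this setup:

```python
import requests

# Query the Airflow webserver's health endpoint (port 8080 per this deployment)
health = requests.get("http://localhost:8080/health", timeout=10).json()

# Each component (metadatabase, scheduler) should report "healthy"
for component, info in health.items():
    print(f"{component}: {info.get('status')}")
```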
Check for port conflicts:

```bash
lsof -i :8080
```

Verify DAG files:

```bash
ls -la dags/
```

Check scheduler logs:

```bash
docker-compose logs -f airflow-scheduler
```

Refresh DAGs:

- Airflow UI → Admin → Refresh all
Check the MCP container status:

```bash
docker-compose ps mcp-server
docker-compose logs mcp-server
```

Test the MCP server connection:

```bash
docker exec -it mcp-server python -c "import requests; print(requests.get('http://localhost:3000/health').status_code)"
```

Verify the Warp configuration:

- Check `mcp-config.json` syntax
- Restart the Warp terminal
- Review Warp logs
Reset the database (using the script):

```bash
./stop.sh
# Select "Yes" when prompted to remove volumes
./start.sh
```

Manual database reset:

```bash
docker-compose down -v
docker volume rm airflow-deploy-mcp_postgres-db-volume
./start.sh
```

Fix folder permissions:

```bash
mkdir -p logs dags plugins
chmod -R 755 logs dags plugins
```

Verify AIRFLOW_UID:

```bash
grep AIRFLOW_UID .env
# Should match your user ID
echo $(id -u)
```
For production deployments:

- Change default passwords:
  - Admin password in `.env`
  - PostgreSQL credentials
  - Fernet key and secret key
- Enable HTTPS:
  - Use a reverse proxy (nginx, traefik)
  - Configure SSL certificates
- Network security:
  - Use Docker network isolation
  - Configure firewall rules
  - Restrict port access
- Secrets management:
  - Use an Airflow Secrets Backend
  - Implement Docker secrets or environment encryption
  - Never store sensitive data in DAG code (see the sketch below)
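To illustrate the last point: secrets can be resolved at runtime from Airflow Variables or Connections (which a Secrets Backend can serve) rather than hardcoded in DAG files. A minimal sketch, reusing the `API_KEY` variable from the Warp examples above (the `my_api` connection ID is hypothetical):

```python
from airflow.hooks.base import BaseHook
from airflow.models import Variable

def call_external_service():
    # Resolved at runtime from the metadata DB or a configured Secrets Backend,
    # so the secret never appears in the DAG source code.
    api_key = Variable.get("API_KEY")

    # Connections follow the same pattern; "my_api" is a hypothetical connection ID.
    conn = BaseHook.get_connection("my_api")

    print(f"Calling {conn.host} with an API key of length {len(api_key)}")
```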
Additional documentation:

- Usage Examples - Practical examples of using the MCP Server for DAG management
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
If you encounter issues or need help:
- Check the Troubleshooting section
- Review logs with `docker-compose logs`
- Check Airflow UI → Browse → Task Instance Logs
Happy Orchestrating! 🎉