Amazon Managed Workflows for Apache Airflow (MWAA) CDK Python project!


This is a sample project for Python development with CDK.

The cdk.json file tells the CDK Toolkit how to execute your app.
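For illustration, a minimal cdk.json for a Python CDK app might look like the sketch below (the exact contents depend on how the project was initialized). Context values passed with -c on the command line, such as s3_bucket_for_dag_code, can alternatively be stored under the context key; command-line -c values override values stored here.

```json
{
  "app": "python3 app.py",
  "context": {
    "s3_bucket_for_dag_code": "your-s3-bucket-for-airflow-dag-code",
    "airflow_env_name": "your-airflow-env-name"
  }
}
```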

This project is set up like a standard Python project. The initialization process also creates a virtualenv within this project, stored under the .venv directory. To create the virtualenv it assumes that there is a python3 (or python for Windows) executable in your path with access to the venv package. If for any reason the automatic creation of the virtualenv fails, you can create the virtualenv manually.

To manually create a virtualenv on macOS and Linux:

$ python3 -m venv .venv

After the init process completes and the virtualenv is created, you can use the following step to activate your virtualenv.

$ source .venv/bin/activate

If you are on a Windows platform, activate the virtualenv like this:

% .venv\Scripts\activate.bat

Once the virtualenv is activated, you can install the required dependencies.

(.venv) $ pip install -r requirements.txt

Before you deploy

Before you deploy this project, create an Amazon S3 bucket to store your Apache Airflow Directed Acyclic Graphs (DAGs), custom plugins in a plugins.zip file, and Python dependencies in a requirements.txt file. For details, see Create an Amazon S3 bucket for Amazon MWAA in the Amazon MWAA documentation.

$ aws s3 mb s3://your-s3-bucket-for-airflow-dag-code --region region-name
$ aws s3api put-public-access-block --bucket your-s3-bucket-for-airflow-dag-code --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true
$ aws s3api put-bucket-versioning --bucket your-s3-bucket-for-airflow-dag-code --versioning-configuration Status=Enabled
$ aws s3api put-object --bucket your-s3-bucket-for-airflow-dag-code --key dags/
$ aws s3api put-object --bucket your-s3-bucket-for-airflow-dag-code --key requirements/requirements.txt
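Before running the commands above, it is worth checking that your chosen bucket name satisfies S3's naming rules: 3-63 characters; lowercase letters, digits, hyphens, and dots only; it must begin and end with a letter or digit, and must not look like an IP address. A quick stdlib sketch of such a check (the function name is our own, not part of any AWS SDK):

```python
import re

# Subset of the S3 bucket naming rules: 3-63 characters, lowercase
# letters, digits, dots, and hyphens; must start and end with a
# letter or digit.
_BUCKET_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def is_valid_bucket_name(name: str) -> bool:
    """Rough validity check for an S3 bucket name (hypothetical helper)."""
    if not _BUCKET_RE.match(name):
        return False
    # Names formatted like an IP address (e.g. 192.168.0.1) are not allowed.
    if re.fullmatch(r"(\d{1,3}\.){3}\d{1,3}", name):
        return False
    return True

print(is_valid_bucket_name("your-s3-bucket-for-airflow-dag-code"))  # True
print(is_valid_bucket_name("Bad_Bucket"))  # False
```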

Deploy

At this point you can synthesize the CloudFormation template for this code. Note that the CDK_DEFAULT_REGION command below queries the EC2 instance metadata service, so it only works on an EC2 instance; on other machines, use aws configure get region instead.

(.venv) $ export CDK_DEFAULT_ACCOUNT=$(aws sts get-caller-identity --query Account --output text)
(.venv) $ export CDK_DEFAULT_REGION=$(curl -s 169.254.169.254/latest/dynamic/instance-identity/document | jq -r .region)
(.venv) $ cdk -c s3_bucket_for_dag_code='your-s3-bucket-for-airflow-dag-code' \
              -c airflow_env_name='your-airflow-env-name' \
              synth --all

Use the cdk deploy command to create the stacks.

(.venv) $ cdk -c s3_bucket_for_dag_code='your-s3-bucket-for-airflow-dag-code' \
              -c airflow_env_name='your-airflow-env-name' \
              deploy --all

To add additional dependencies, for example other CDK libraries, just add them to your setup.py file and rerun the pip install -r requirements.txt command.

Clean Up

Delete the CloudFormation stacks by running the command below.

(.venv) $ cdk destroy --force --all

Useful commands

  • cdk ls list all stacks in the app
  • cdk synth emits the synthesized CloudFormation template
  • cdk deploy deploy this stack to your default AWS account/region
  • cdk diff compare deployed stack with current state
  • cdk docs open CDK documentation

Tips

  • To update requirements.txt in your MWAA environment, run commands like this (the bucket is the same one that holds your DAG code):
    $ obj_version=$(aws s3api list-object-versions --bucket your-s3-bucket-for-airflow-dag-code --prefix 'requirements/requirements.txt' | jq -r '.Versions[0].VersionId')
    $ echo ${obj_version}
    $ aws mwaa update-environment \
        --region region-name \
        --name your-airflow-environment \
        --requirements-s3-object-version ${obj_version}
    
  • sample requirements.txt
    apache-airflow-providers-elasticsearch==1.0.3
    apache-airflow-providers-redis==1.0.1
    apache-airflow-providers-google==2.2.0
    apache-airflow-providers-mysql==1.1.0
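
The jq pipeline in the first tip simply pulls the first VersionId out of the list-object-versions response; versions are listed newest first, so Versions[0] is the most recently uploaded requirements.txt. A stdlib sketch of the same extraction, using a trimmed, made-up sample response (the VersionId values are illustrative only):

```python
import json

# A trimmed, made-up example of what `aws s3api list-object-versions`
# returns for the requirements/requirements.txt prefix.
response = json.loads("""
{
  "Versions": [
    {"Key": "requirements/requirements.txt", "VersionId": "3sL4kqtJlcpXroDTDmJ", "IsLatest": true},
    {"Key": "requirements/requirements.txt", "VersionId": "QUpfdndhfd8438MNFDN", "IsLatest": false}
  ]
}
""")

# Equivalent of: jq -r '.Versions[0].VersionId'
obj_version = response["Versions"][0]["VersionId"]
print(obj_version)  # 3sL4kqtJlcpXroDTDmJ
```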
    

Enjoy!