This is a demo for AWS SAM, consisting of two parts:
- Local testing and deployment of a serverless application
- Building a CI/CD pipeline for a serverless application
For an introduction to SAM, see https://aws.amazon.com/serverless/sam/
Below is a description of the files used in this demo.
.
├── README.md
├── buildspec.yml        # Defines what CodeBuild runs
├── deploy.sh            # Wrapper for deploying the serverless stack
├── env.config           # Defines environment parameters
├── event.json           # Event definition file; in this case, the payload for an API Gateway event
├── hello_world          # Hello world example, generated by 'sam init', then updated with your content
│   ├── __init__.py
│   ├── app.py
│   └── requirements.txt
├── output.sh            # Wrapper for reading the CloudFormation stack outputs to get the API Gateway endpoint
├── package.sh           # Wrapper for packaging, i.e. transforming the SAM template into a CloudFormation template
├── packaged.yaml        # The CloudFormation template transformed from the SAM template
├── requirement.txt      # Dependencies for the demo Python app
├── template.yaml        # The SAM template
└── tests                # A sample unit test
    └── unit
        ├── __init__.py
        └── test_handler.py
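For reference, hello_world/app.py follows the standard handler shape produced by `sam init`; here is a minimal sketch (the actual file in the repo may differ):

```python
import json


def lambda_handler(event, context):
    """Minimal API Gateway proxy handler, in the shape `sam init` generates
    (sketch only; the demo's real app.py may contain more)."""
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "hello world! v1.1"}),
    }
```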
Install the SAM CLI and prepare your credentials following the instructions at https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html
Install Python 3
Install Docker
git clone https://github.com/totorochina/sam-cicd
Prepare and validate your environment
cd sam-cicd
sam build
Invoke the Lambda function locally
sam local invoke
Start a local API Gateway for testing
sam local start-api
and validate it
curl http://127.0.0.1:3000/hello
Now you can deploy the API Gateway and Lambda stack directly through SAM. Before that, you have to transform the SAM template into a CloudFormation template,
bash package.sh
And here is what's inside,
#!/bin/bash
source env.config
sam validate && \
sam build && \
sam package \
--output-template-file packaged.yaml \
--s3-bucket $S3_BUCKET
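Both package.sh and deploy.sh start with `source env.config`; a hypothetical example of that file (all values are placeholders you must replace with your own):

```shell
# env.config -- example contents (placeholder values)
S3_BUCKET=my-sam-artifact-bucket   # S3 bucket for packaged artifacts
STACK=sam-cicd-demo                # CloudFormation stack name
REGION=us-east-1                   # deployment region
```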
After you get the CloudFormation template packaged.yaml, you can deploy the stack
bash deploy.sh
I made a wrapper, and this is what's inside,
#!/bin/bash
source env.config # Use this to control your settings, regions, and environment(Test, Staging, Prod)
# Deploy defined stack to a region
sam deploy \
--template-file packaged.yaml \
--stack-name $STACK \
--capabilities CAPABILITY_IAM \
--region $REGION
# You may have to change `HelloWorldApi` in the ?OutputKey== filter to match the output defined in template.yaml
aws cloudformation describe-stacks --stack-name $STACK \
--query 'Stacks[0].Outputs[?OutputKey==`HelloWorldApi`].OutputValue' \
--output text \
--region $REGION
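The --query option above is a JMESPath expression that picks the API endpoint out of the stack outputs. Its filtering logic, rewritten in plain Python against a mocked describe-stacks response (the output values below are made up for illustration):

```python
# Mocked shape of the `aws cloudformation describe-stacks` response (values are hypothetical)
stacks = [{
    "Outputs": [
        {"OutputKey": "HelloWorldFunction",
         "OutputValue": "arn:aws:lambda:us-east-1:123456789012:function:demo"},
        {"OutputKey": "HelloWorldApi",
         "OutputValue": "https://abc123.execute-api.us-east-1.amazonaws.com/Prod/hello/"},
    ],
}]

# Equivalent of: Stacks[0].Outputs[?OutputKey==`HelloWorldApi`].OutputValue
endpoints = [o["OutputValue"]
             for o in stacks[0]["Outputs"]
             if o["OutputKey"] == "HelloWorldApi"]
print(endpoints[0])
```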
To set up a canary deployment, update the "DeploymentPreference" section in template.yaml
DeploymentPreference:
  #Type: Linear10PercentEvery1Minute
  #Type: AllAtOnce
  Type:
    'Fn::Sub': '${DeploymentPreference}'
Update the demo application; for showcase purposes, bump the version number.
vi hello_world/app.py
change
"message": "hello world! v1.1",
to
"message": "hello world! v1.2",
Then package and deploy again,
bash package.sh
bash deploy.sh
A canary deployment takes time, depending on the "DeploymentPreference" you set. To watch the process, get the API Gateway endpoint,
bash output.sh
Then curl and watch the result,
while true
do
  curl "<API_GATEWAY_ENDPOINT_URL>"
  echo
  sleep 1  # avoid hammering the endpoint
done
Then you can see the responses gradually shift from v1.1 to v1.2.
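For intuition about the timing: Linear10PercentEvery1Minute shifts another 10% of traffic to the new version each minute. A rough sketch of that schedule (my own illustration, not CodeDeploy's actual implementation):

```python
def new_version_traffic(minutes_elapsed, step_percent=10, interval_minutes=1):
    """Approximate share of traffic (in percent) routed to the new version
    under a LinearXPercentEveryYMinutes preference (illustrative only)."""
    steps_completed = minutes_elapsed // interval_minutes
    return min(100, step_percent * steps_completed)


# Linear10PercentEvery1Minute reaches full cutover after about 10 minutes
for minute in (0, 5, 10):
    print(minute, new_version_traffic(minute), "%")
```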
We can also use SAM and CodePipeline to build a CI/CD pipeline.
Create a new pipeline with CodePipeline:
Choose pipeline settings -> keep the defaults
Source provider -> CodeCommit/GitHub; Repository Name -> sam-cicd; Branch Name -> master
Build provider -> CodeBuild; Region -> the same region as your stack; Create Project -> make sure 'Use a buildspec file' is checked (any Linux environment should work for this demo); Environment Variables -> define 'S3_BUCKET' and 'REGION'
Deploy provider -> AWS CloudFormation; Action Mode -> Create or update a stack; Template -> BuildArtifact::packaged.yaml; Capabilities -> CAPABILITY_IAM & CAPABILITY_AUTO_EXPAND; Role -> create a role with the policy below
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "apigateway:*",
        "codedeploy:*",
        "lambda:*",
        "cloudformation:CreateChangeSet",
        "iam:GetRole",
        "iam:CreateRole",
        "iam:DeleteRole",
        "iam:PutRolePolicy",
        "iam:AttachRolePolicy",
        "iam:DeleteRolePolicy",
        "iam:DetachRolePolicy",
        "iam:PassRole",
        "s3:GetObjectVersion",
        "s3:GetBucketVersioning"
      ],
      "Resource": "*",
      "Effect": "Allow"
    }
  ]
}
After that, test the pipeline by updating app.py and pushing with git.
The CodeBuild stage may fail because the role it creates for you automatically may not have access to the S3 bucket you defined for storing artifacts. If that happens, attach AmazonS3FullAccess to the role used by CodeBuild.
If you need a multi-stage pipeline, e.g. deploying AllAtOnce for a Staging environment while using a Linear deployment for Production, update the Deploy stage of each environment with different Parameter Overrides, e.g.
{'Env': 'prod', 'DeploymentPreference': 'Canary10Percent5Minutes'}
This works because the following is defined in template.yaml
Parameters:
  Env:
    Type: String
    Default: prod
    AllowedValues:
      - prod
      - staging
    Description: Define prod/staging by Parameters
  DeploymentPreference:
    Type: String
    Default: AllAtOnce
    AllowedValues:
      - Canary10Percent5Minutes
      - Canary10Percent10Minutes
      - Canary10Percent15Minutes
      - Canary10Percent30Minutes
      - Linear10PercentEvery1Minute
      - Linear10PercentEvery2Minutes
      - Linear10PercentEvery3Minutes
      - Linear10PercentEvery10Minutes
      - AllAtOnce
DeploymentPreference:
  #Type: Linear10PercentEvery1Minute
  #Type: AllAtOnce
  Type:
    'Fn::Sub': '${DeploymentPreference}'
This library is licensed under the MIT-0 License. See the LICENSE file.