CQRS Design Pattern

This workflow demonstrates how to implement the command query responsibility segregation (CQRS) design pattern using AWS Step Functions.

The workflow is triggered by Amazon EventBridge events. Based on event properties, a state routes each event to its respective handler. An order creation command is handled directly in the workflow and persisted in an Amazon DynamoDB table. Created orders are captured by an Amazon DynamoDB stream, transformed by an AWS Lambda function, and stored in an Amazon Aurora Serverless v2 MySQL-Compatible Edition database. Query events are routed to the respective Lambda functions, which get the data from Aurora Serverless v2 and send the results to EventBridge.

Important: this application uses various AWS services and there are costs associated with these services after the Free Tier usage - please see the AWS Pricing page for details. You are responsible for any AWS costs incurred. No warranty is implied in this example.

Requirements

  • An AWS account with permissions to create the resources in this example
  • AWS CLI installed and configured
  • Git installed
  • AWS SAM CLI installed

Deployment Instructions

  1. Create a new directory, navigate to that directory in a terminal and clone the GitHub repository:

    git clone https://github.com/aws-samples/step-functions-workflows-collection
    
  2. Change directory to the pattern directory:

    cd cqrs
    
  3. From the command line, use AWS SAM to deploy the AWS resources for the workflow as specified in the template.yaml file:

    sam build
    sam deploy --guided
    
  4. During the prompts:

    • Enter a stack name
    • Enter the desired AWS Region
    • Allow SAM CLI to create IAM roles with the required permissions.

    Once you have run sam deploy --guided once and saved the arguments to a configuration file (samconfig.toml), you can use sam deploy in the future to reuse these defaults.

  5. Note the outputs from the SAM deployment process. These contain the resource names and/or ARNs which are used for testing.

How it works

The scenario is a basic order creation and reporting application. Orders, including their line items, are received as events and persisted in a data store. Queries are also received as events; they are run against the data source and the results are emitted as events. In keeping with the CQRS pattern, the queries have data access patterns that differ from the order structure.

A sample order looks like this:

{
  "lastUpdateDate": "2022-11-01T16:26:03Z",
  "status": "orderCreated"
  "orderId": "6002",
  "items" : [
    {
      "itemid": "z3333",
      "quantity": 6
    },
    {
      "itemid": "z4444",
      "quantity": 3
    }
  ]
}

Because the example queries use order items as the primary entity and require aggregation, the data is copied to a relational database for easier processing. After normalizing the order data, the resulting relational database schema looks like this:

[Image: normalized relational database schema]
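
A minimal DDL sketch of these tables, assuming column names taken from the queries below (the actual tables are created by an initialization Lambda function during deployment and may differ in names and types):

-- Hypothetical sketch of the normalized schema
CREATE TABLE orders (
    orderid   VARCHAR(32) NOT NULL,   -- order identifier from the incoming event
    orderdate DATETIME    NOT NULL,   -- order timestamp used by the report queries
    status    VARCHAR(32),            -- e.g. orderCreated
    PRIMARY KEY (orderid)
);

CREATE TABLE orderitems (
    orderid  VARCHAR(32) NOT NULL,    -- references orders.orderid
    itemid   VARCHAR(32) NOT NULL,    -- line item identifier
    quantity INT         NOT NULL,    -- quantity ordered
    PRIMARY KEY (orderid, itemid),
    FOREIGN KEY (orderid) REFERENCES orders (orderid)
);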

On the query side, two queries are defined. Item Sales Report is a simple query that lists all itemids with the total quantities ordered:

SELECT a.itemid, SUM(a.quantity) as totalordered, DATE_FORMAT(MAX(b.orderdate),'%Y-%m-%d') as lastorderdate
FROM orderitems a LEFT JOIN orders b ON a.orderid = b.orderid
GROUP BY a.itemid;

A second query takes an itemid parameter and returns that item's sales per month:

SELECT DATE_FORMAT(b.orderdate,'%M %Y') as monthyear, SUM(a.quantity) as monthlyordered
FROM orderitems a LEFT JOIN orders b ON a.orderid = b.orderid
WHERE itemid = <itemid parameter>
GROUP BY year(b.orderdate),month(b.orderdate)
ORDER BY b.orderdate DESC;

The overall architecture of the solution is explained below:

[Image: overall solution architecture]

Order data is persisted in a DynamoDB table. For the report queries, a relational database provides more flexibility, so Aurora Serverless v2 is used. To capture changes to the DynamoDB table, DynamoDB Streams is used. A Lambda function, triggered by events on the DynamoDB stream, performs the required transformation on the data and inserts it into Aurora Serverless v2. For simplicity, only INSERT events are processed.
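
For illustration, a DynamoDB Streams INSERT record for the sample order above would look roughly like the following (the table key and attribute layout are assumptions based on the sample data, and only the fields relevant to the transformation are shown):

{
  "eventName": "INSERT",
  "dynamodb": {
    "Keys": { "orderId": { "S": "6002" } },
    "NewImage": {
      "orderId": { "S": "6002" },
      "status": { "S": "orderCreated" },
      "lastUpdateDate": { "S": "2022-11-01T16:26:03Z" },
      "items": {
        "L": [
          { "M": { "itemid": { "S": "z3333" }, "quantity": { "N": "6" } } },
          { "M": { "itemid": { "S": "z4444" }, "quantity": { "N": "3" } } }
        ]
      }
    }
  }
}

The Lambda function flattens this DynamoDB-typed document into one orders row and one orderitems row per line item before inserting into Aurora Serverless v2.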


The workflow is triggered by EventBridge events. Events are filtered by an EventBridge rule, and the workflow is triggered when an event arrives whose Source property is orderPipeline and whose DetailType property is either orderCommand or orderQuery.
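
A minimal sketch of an event pattern that implements this filter (the actual rule is defined in template.yaml):

{
  "source": ["orderPipeline"],
  "detail-type": ["orderCommand", "orderQuery"]
}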

The first state is a Choice state that routes the event based on DetailType. Commands are processed on a branch that loops over the order items with a Map state to marshal the line items, then persists the order in the orders table in DynamoDB using a PutItem action. A confirmation event is then sent to EventBridge using a PutEvents action.
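
As a sketch, the top-level routing in Amazon States Language could look like the following; the state names and structure are illustrative and do not necessarily match the deployed definition:

{
  "Comment": "Illustrative routing sketch; not taken from the deployed definition",
  "StartAt": "Route Event",
  "States": {
    "Route Event": {
      "Type": "Choice",
      "Choices": [
        { "Variable": "$['detail-type']", "StringEquals": "orderCommand", "Next": "Handle Command" },
        { "Variable": "$['detail-type']", "StringEquals": "orderQuery", "Next": "Route Query" }
      ]
    },
    "Handle Command": { "Type": "Pass", "Comment": "Stands in for the Map/PutItem/PutEvents command branch", "End": true },
    "Route Query": { "Type": "Pass", "Comment": "Stands in for the query branch described below", "End": true }
  }
}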

[Image: command handling branch of the workflow]

Queries are handled on a different branch. A second Choice state routes the event to the corresponding Lambda function based on the type of query specified in the event detail. The Lambda functions run the queries against Aurora Serverless v2, and the result set is then sent as an event to EventBridge using a PutEvents action.
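
A sketch of one query path, assuming the optimized Lambda invoke and EventBridge PutEvents service integrations; the function name, result detail type, and first state name are illustrative placeholders, while "Send ItemSalesReport Query Results" is the step name referenced in the Testing section:

"Run ItemSalesReport Query": {
  "Type": "Task",
  "Resource": "arn:aws:states:::lambda:invoke",
  "Parameters": {
    "FunctionName": "<ItemSalesReport Lambda function ARN>",
    "Payload.$": "$.detail"
  },
  "ResultSelector": { "results.$": "$.Payload" },
  "Next": "Send ItemSalesReport Query Results"
},
"Send ItemSalesReport Query Results": {
  "Type": "Task",
  "Resource": "arn:aws:states:::events:putEvents",
  "Parameters": {
    "Entries": [
      {
        "EventBusName": "<event bus name>",
        "Source": "orderPipeline",
        "DetailType": "orderQueryResult",
        "Detail.$": "$.results"
      }
    ]
  },
  "End": true
}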

[Image: query handling branch of the workflow]

The workflow is implemented as an Express workflow, as the execution time for requests is short. EventBridge triggers the workflow asynchronously, which guarantees at-least-once workflow execution, and the workflow implementation ensures idempotency.

The SAM template provisions all the required components, including the VPC needed by Aurora Serverless v2. Database credentials are stored in AWS Secrets Manager, and an AWS CloudFormation custom resource triggers a Lambda function that initializes the Aurora Serverless v2 database, creating the required tables and procedures when the template is deployed.


Testing

You need to send events to EventBridge to trigger the workflow. The easiest way to send events is with the AWS CLI. Make sure you have installed the latest version of the AWS CLI and configured it to use the same Region in which you deployed the workflow.

To send events, you will need the ARN of the EventBridge event bus that was created. You can get it from the output of the sam deploy command, under the EventBridgeARN key:

[Image: SAM deploy output showing the EventBridgeARN key]

or from the CloudFormation stack outputs:

[Image: CloudFormation stack outputs]
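
Alternatively, you can read the output with the AWS CLI, replacing STACK_NAME with the stack name you chose during sam deploy:

aws cloudformation describe-stacks --stack-name STACK_NAME --query "Stacks[0].Outputs[?OutputKey=='EventBridgeARN'].OutputValue" --output text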

The AWS CLI commands to send an example command and an example query are given below. To use them, set an environment variable that holds the EventBridge ARN. Use the command for your platform and replace <EventBridge Event Bus ARN> with the ARN you obtained in the previous step:

For Linux and MacOS:

export AWS_CQRS_EVENTBRIDGE_ARN=<EventBridge Event Bus ARN>

For Windows Command Prompt:

set AWS_CQRS_EVENTBRIDGE_ARN=<EventBridge Event Bus ARN>

After setting the EventBridge ARN, run the AWS CLI commands below to send the example events. For a command that inserts an order into DynamoDB, use the following:

For Linux and MacOS:

aws events put-events --entries '[{"EventBusName" : "'"$AWS_CQRS_EVENTBRIDGE_ARN"'","Time" : "2022-10-13T19:00:00Z","DetailType": "orderCommand","Source": "orderPipeline","Detail": "{\"command\": \"createOrder\",\"orderId\": \"6102\",\"items\" : [{\"itemid\": \"z3333\",\"quantity\": 6},{\"itemid\": \"z4444\",\"quantity\": 3}]}"}]'

For Windows Command Prompt:

aws events put-events --entries="[{\"EventBusName\" : \"%AWS_CQRS_EVENTBRIDGE_ARN%\",\"Time\" : \"2022-10-13T19:00:00Z\",\"DetailType\": \"orderCommand\",\"Source\": \"orderPipeline\",\"Detail\": \"{\\\"command\\\": \\\"createOrder\\\",\\\"orderId\\\": \\\"6102\\\",\\\"items\\\" : [{\\\"itemid\\\": \\\"z3333\\\",\\\"quantity\\\": 6},{\\\"itemid\\\": \\\"z4444\\\",\\\"quantity\\\": 3}]}\"}]"

To see the execution results, open the Step Functions console, click the workflow name CQRSPatternStepFunction, and on the Executions tab you should see an execution. Click the Name of the execution to see its details. You can also check the DynamoDB table to confirm that the order details sent were inserted.
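
For a quick check from the command line, you can also scan the table, replacing the placeholder with the name of the DynamoDB table created by the stack:

aws dynamodb scan --table-name <DynamoDB table name> --max-items 5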

To send a query event, use the AWS CLI command below.

For Linux and MacOS:

aws events put-events --entries '[{"EventBusName" : "'"$AWS_CQRS_EVENTBRIDGE_ARN"'","Time" : "2022-11-14T19:00:00Z","DetailType": "orderQuery", "Source": "orderPipeline", "Detail": "{\"query\": \"itemsSalesReport\"}"}]'

For Windows Command Prompt:

aws events put-events --entries="[{\"EventBusName\" : \"%AWS_CQRS_EVENTBRIDGE_ARN%\",\"Time\" : \"2022-11-14T19:00:00Z\",\"DetailType\": \"orderQuery\", \"Source\": \"orderPipeline\", \"Detail\": \"{\\\"query\\\": \\\"itemsSalesReport\\\"}\"}]"

You can see the execution details in the Step Functions console. To see the results of the query, check the Input section of the Send ItemSalesReport Query Results step in the execution details.

[Image: query results shown in the execution details]

You can find the sample events in the events folder of the solution as well.

Cleanup

  1. Delete the stack
    sam delete
  2. Confirm the stack has been deleted
    aws cloudformation list-stacks --query "StackSummaries[?contains(StackName,'STACK_NAME')].StackStatus"

Please note that CloudFormation stack deletion may take some time due to the cleanup of VPC-connected Lambda functions.


Copyright 2022 Amazon.com, Inc. or its affiliates. All Rights Reserved.

SPDX-License-Identifier: MIT-0