Commit

Merge branch 'development'
ianwow committed Dec 9, 2020
2 parents 74a2a51 + d57b457 commit 3aa70ca
Showing 10 changed files with 185 additions and 74 deletions.
4 changes: 4 additions & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -0,0 +1,4 @@
node_modules
package
dist
package-lock.json
97 changes: 91 additions & 6 deletions IMPLEMENTATION_GUIDE.md
Original file line number Diff line number Diff line change
Expand Up @@ -30,7 +30,9 @@ Join our Gitter chat at [https://gitter.im/awslabs/aws-media-insights-engine](ht

[5. API Documentation](#5-api-documentation)

[6. Glossary](#6-glossary)
[6. Troubleshooting](#6-troubleshooting)

[7. Glossary](#7-glossary)


# 1. Overview
Expand All @@ -56,17 +58,17 @@ If you already have MIE deployed in your account, then use one of the following

Region| Launch
------|-----
US East (N. Virginia) | [![Launch in us-east-1](doc/images/launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-east-1#/stacks/new?stackName=mie&templateURL=https://rodeolabz-us-east-1.s3.amazonaws.com/content-analysis-solution/v1.0.0/cf/aws-content-analysis.template)
US West (Oregon) | [![Launch in us-west-2](doc/images/launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-west-2#/stacks/new?stackName=mie&templateURL=https://rodeolabz-us-west-2.s3.amazonaws.com/content-analysis-solution/v1.0.0/cf/aws-content-analysis.template)
US East (N. Virginia) | [![Launch in us-east-1](doc/images/launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-east-1#/stacks/new?stackName=mie&templateURL=https://rodeolabz-us-east-1.s3.amazonaws.com/content-analysis-solution/v2.0.0/cf/aws-content-analysis.template)
US West (Oregon) | [![Launch in us-west-2](doc/images/launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-west-2#/stacks/new?stackName=mie&templateURL=https://rodeolabz-us-west-2.s3.amazonaws.com/content-analysis-solution/v2.0.0/cf/aws-content-analysis.template)

#### *Option 2:* Install back-end + front-end

If you do not have MIE deployed in your account, then use one of the following buttons to deploy both MIE and the Media Insights front-end application. This will deploy a prebuilt version of the most recent MIE release.

Region| Launch
------|-----
US East (N. Virginia) | [![Launch in us-east-1](doc/images/launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-east-1#/stacks/new?stackName=mie&templateURL=https://rodeolabz-us-east-1.s3.amazonaws.com/content-analysis-solution/v1.0.0/cf/aws-content-analysis-deploy-mie.template)
US West (Oregon) | [![Launch in us-west-2](doc/images/launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-west-2#/stacks/new?stackName=mie&templateURL=https://rodeolabz-us-west-2.s3.amazonaws.com/content-analysis-solution/v1.0.0/cf/aws-content-analysis-deploy-mie.template)
US East (N. Virginia) | [![Launch in us-east-1](doc/images/launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-east-1#/stacks/new?stackName=mie&templateURL=https://rodeolabz-us-east-1.s3.amazonaws.com/content-analysis-solution/v2.0.0/cf/aws-content-analysis-deploy-mie.template)
US West (Oregon) | [![Launch in us-west-2](doc/images/launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-west-2#/stacks/new?stackName=mie&templateURL=https://rodeolabz-us-west-2.s3.amazonaws.com/content-analysis-solution/v2.0.0/cf/aws-content-analysis-deploy-mie.template)

# 3. Security

Expand Down Expand Up @@ -900,7 +902,90 @@ Now you can use Kibana to validate that your operator's data is present in Elast
* 404: Not found
* 500: Internal server error

# 6. Glossary
# 6. Troubleshooting

## How to enable AWS X-Ray request tracing for MIE

AWS X-Ray traces requests as they travel through the AWS platform. It is especially useful for performance debugging, and it also aids other kinds of debugging by making it easy to follow a request end to end across AWS services, even when the request triggers execution across multiple AWS accounts.

The AWS X-Ray service has a perpetual free tier. When free-tier limits are exceeded, X-Ray tracing incurs charges as outlined on the [X-Ray pricing](https://aws.amazon.com/xray/pricing/) page.


### Enable tracing from Lambda entry points

By default, tracing for MIE is disabled. You can enable AWS X-Ray tracing for MIE requests by updating the MIE stack with the **EnableXrayTrace** CloudFormation parameter set to `true`. When tracing is enabled, all supported services invoked for a request are traced, starting from the MIE Lambda entry points. These entry-point Lambdas are as follows:

* WorkflowAPIHandler
* WorkflowCustomResource
* WorkflowScheduler
* DataplaneAPIHandler
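Flipping the parameter is an ordinary stack update. The following is a hedged sketch using boto3; the stack name `mie` and the capability list are assumptions to match to your deployment, and any parameters not listed should be passed with `UsePreviousValue` so they keep their current values rather than reverting to defaults:

```python
def xray_parameters(enable):
    # Build the parameter entry that flips EnableXrayTrace. In a real update,
    # also pass the stack's other parameters with UsePreviousValue=True.
    return [{"ParameterKey": "EnableXrayTrace",
             "ParameterValue": "true" if enable else "false"}]

def enable_xray(stack_name="mie"):
    import boto3  # imported here so the sketch has no hard dependency at import time
    cfn = boto3.client("cloudformation")
    cfn.update_stack(
        StackName=stack_name,
        UsePreviousTemplate=True,  # reuse the deployed template as-is
        Parameters=xray_parameters(True),
        Capabilities=["CAPABILITY_IAM", "CAPABILITY_NAMED_IAM"],
    )
```

The same update can be performed from the CloudFormation console by choosing **Update stack** and using the current template.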

### Enable tracing from API Gateway entry points

Additionally, you can enable tracing for API Gateway requests in the AWS Console by checking the *Enable tracing* option for the deployed API Gateway stages for both the Workflow API and the Dataplane API. See the [AWS console documentation](https://docs.aws.amazon.com/xray/latest/devguide/xray-services-apigateway.html) for more info.

### Developing custom tracing in MIE lambda functions

MIE Lambdas import the [X-Ray Python packages](https://docs.aws.amazon.com/xray/latest/devguide/xray-sdk-python.html) and patch all supported libraries at runtime, so they are ready for further instrumentation by developers using those packages.

The MIE Lambda Layer contains all the package dependencies needed to support X-Ray, so they are available to any new Lambda that uses the Layer.

## MIE workflow error handling

When you create MIE workflows, MIE automatically creates state machines for you with built-in error handling.

There are two levels of error handling in MIE workflow state machines: operator error handling and workflow error handling.

### Operator error handling

#### Operator lambda code

Operator Lambdas can use the `MasExecutionError` class from the `MediaInsightsEngineLambdaHelper` Python library to consistently handle errors that occur within the Lambda code of MIE operators.

The following is an example of lambda function error handling used in the **ENTITIES** (Comprehend) operator:

```python
from MediaInsightsEngineLambdaHelper import MasExecutionError

try:
    ...
except Exception as e:
    operator_object.update_workflow_status("Error")
    operator_object.add_workflow_metadata(
        comprehend_entity_job_id=job_id,
        comprehend_error="comprehend returned as failed: {e}".format(
            e=response["EntitiesDetectionJobPropertiesList"][0]["Message"]
        ),
    )
    raise MasExecutionError(operator_object.return_output_object())
```

This code updates the operator's outputs within the workflow_execution results with the error status and specific information about the failure, then raises an exception. The exception triggers the `Catch` and `Retry` error handling within the state machine (see the next section).
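The pattern can be sketched end to end as follows. The helper class here is a hypothetical stand-in for the `MediaInsightsEngineLambdaHelper` output helper, trimmed to just what the pattern needs, and the failing service call is simulated:

```python
class MasExecutionError(Exception):
    """Raised with the operator's output object so Step Functions can catch it."""

class OperatorObject:
    # Hypothetical stand-in for the MIE helper's operator output object.
    def __init__(self, name):
        self.name = name
        self.workflow_status = None
        self.metadata = {}

    def update_workflow_status(self, status):
        self.workflow_status = status

    def add_workflow_metadata(self, **kwargs):
        self.metadata.update(kwargs)

    def return_output_object(self):
        return {"name": self.name,
                "status": self.workflow_status,
                "metadata": self.metadata}

def lambda_handler(event, context):
    operator_object = OperatorObject("entityDetection")
    try:
        raise RuntimeError("comprehend job failed")  # stand-in for the real service call
    except Exception as e:
        # Record the error on the workflow execution, then re-raise so the
        # state machine's Catch/Retry blocks take over.
        operator_object.update_workflow_status("Error")
        operator_object.add_workflow_metadata(comprehend_error=str(e))
        raise MasExecutionError(operator_object.return_output_object())
```

Because the exception carries the full output object, the state machine's `Catch` block receives the error status and metadata in `$.Outputs`.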

#### Operator state machine ASL error handling

Operators use `Catch` and `Retry` to handle errors that occur in the steps of the operator state machine tasks. If a step returns an error, the operator is retried. If the retry attempts fail, the **OperatorFailed** Lambda resource is invoked to handle the error: it ensures the workflow_execution object contains the error status and specific information about the failure, and it propagates the workflow execution error status to the control plane. The following is an example of the `Catch` and `Retry` states, written in Amazon States Language (ASL), for MIE state machine error handling:

```json
{
    ...
    "Retry": [{
        "ErrorEquals": ["Lambda.ServiceException", "Lambda.AWSLambdaException", "Lambda.SdkClientException", "Lambda.Unknown", "MasExecutionError"],
        "IntervalSeconds": 2,
        "MaxAttempts": 2,
        "BackoffRate": 2
    }],
    "Catch": [{
        "ErrorEquals": ["States.ALL"],
        "Next": "<OPERATION_NAME> Failed (<STAGE_NAME>)",
        "ResultPath": "$.Outputs"
    }]
    ...
}
```

### Workflow state machine error handling

If an error occurs in the Step Functions service that causes the state machine execution for an MIE workflow to terminate immediately, then the `Catch` and `Retry` states and the **OperatorFailed** Lambda cannot handle the error. These errors can occur in a number of circumstances, for example when the Step Functions execution history limit is exceeded or the execution is stopped (aborted) from the AWS console. Left unhandled, they would leave the workflow in a perpetual `Started` status in the MIE control plane.

The **WorkflowErrorHandlerLambda** resource is triggered when the Step Functions service emits `Step Functions Execution Status Change` EventBridge events that have an error status (`FAILED`, `TIMED_OUT`, or `ABORTED`). The error handler propagates the error to the MIE control plane if the workflow has not already completed.
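The filtering this handler performs can be sketched as follows. The event shape follows the EventBridge event emitted by Step Functions; the control-plane propagation call itself is omitted here:

```python
# Error statuses that require propagating a failure to the MIE control plane.
ERROR_STATUSES = {"FAILED", "TIMED_OUT", "ABORTED"}

def should_propagate(event):
    # React only to Step Functions status-change events whose status is an
    # error state; SUCCEEDED and RUNNING events are ignored.
    return (
        event.get("detail-type") == "Step Functions Execution Status Change"
        and event.get("detail", {}).get("status") in ERROR_STATUSES
    )
```

An EventBridge rule with an equivalent event pattern invokes the handler, which then marks the affected workflow as failed if it is not already complete.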

# 7. Glossary

## Workflow API
Triggers the execution of a workflow. Also creates, updates, and deletes workflows and operators. Monitors the status of workflows.
Expand Down
14 changes: 12 additions & 2 deletions README.md
Original file line number Diff line number Diff line change
Expand Up @@ -10,11 +10,21 @@ The following Cloudformation templates will deploy the Media Insights front-end

Region| Launch
------|-----
US East (N. Virginia) | [![Launch in us-east-1](doc/images/launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-east-1#/stacks/new?stackName=mie&templateURL=https://rodeolabz-us-east-1.s3.amazonaws.com/content-analysis-solution/v1.0.0/cf/aws-content-analysis-deploy-mie.template)
US West (Oregon) | [![Launch in us-west-2](doc/images/launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-west-2#/stacks/new?stackName=mie&templateURL=https://rodeolabz-us-west-2.s3.amazonaws.com/content-analysis-solution/v1.0.0/cf/aws-content-analysis-deploy-mie.template)
US East (N. Virginia) | [![Launch in us-east-1](doc/images/launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-east-1#/stacks/new?stackName=mie&templateURL=https://rodeolabz-us-east-1.s3.amazonaws.com/content-analysis-solution/v2.0.0/cf/aws-content-analysis-deploy-mie.template)
US West (Oregon) | [![Launch in us-west-2](doc/images/launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-west-2#/stacks/new?stackName=mie&templateURL=https://rodeolabz-us-west-2.s3.amazonaws.com/content-analysis-solution/v2.0.0/cf/aws-content-analysis-deploy-mie.template)

For more installation options, see the [Implementation Guide](IMPLEMENTATION_GUIDE.md).

# COST

You are responsible for the cost of the AWS services used while running this application. The primary cost factors are Amazon Rekognition and Amazon Elasticsearch Service (Amazon ES). Processing video costs about $0.50 per minute, but can vary between $0.10 and $0.60 per minute depending on the video content. If you disable Amazon Rekognition in your workflow configuration, video costs can decrease to approximately $0.04 per minute. Data storage and Amazon ES cost approximately ***$10.00 per day*** regardless of the quantity or type of video content.

After a video is uploaded into the solution, the costs for processing are a one-time expense. However, data storage costs occur daily, as shown in the following screenshot from AWS Cost Explorer.

<img src="doc/images/cost.png" width=600>

For more information about cost, see the pricing webpage for each AWS service you will be using in this solution. If you need to process a large volume of videos, we recommend that you contact your AWS account representative for at-scale pricing.

# Analysis Workflow

After uploading a video or image in the GUI, the application runs a workflow in MIE that extracts insights using a variety of media analysis services on AWS and stores them in a search engine for easy exploration. The following flow diagram illustrates this workflow:
Expand Down
51 changes: 24 additions & 27 deletions cloudformation/aws-content-analysis-deploy-mie.yaml
Original file line number Diff line number Diff line change
Expand Up @@ -2,11 +2,6 @@ AWSTemplateFormatVersion: "2010-09-09"
Description: AWS Content Analysis - Deploys the AWS Content Analysis Application and the required Media Insights Engine Framework

Parameters:
MieVersion:
Type: String
Description: The version of the Media Insights Engine framework to deploy
Default: "v1.0.0"
AllowedValues: ["v1.0.0"]
AdminEmail:
Description: Email address of the Content Analysis Administrator
Type: String
Expand All @@ -22,15 +17,11 @@ Parameters:
- "c4.xlarge.elasticsearch"
- "r4.large.elasticsearch"
- "r4.xlarge.elasticsearch"
DeployTestWorkflow:
Description: Deploy test workflow which contains operator, stage, and workflow stubs for integration testing
Type: String
Default: No
AllowedValues:
- Yes
- No

Mappings:
MediaInsightsEngine:
Release:
Version: "v2.0.0"
ContentAnalysisApp:
SourceCode:
S3Bucket: "%%BUCKET_NAME%%"
Expand All @@ -41,7 +32,7 @@ Mappings:
Resources:
# Deploy MIE Framework

MediaInsightsFrameworkStack:
MieStack:
Type: "AWS::CloudFormation::Stack"
Properties:
TemplateURL: !Join
Expand All @@ -53,11 +44,17 @@ Resources:
- !Ref AWS::Region
- ".amazonaws.com/"
- "media_insights_engine/"
- !Ref MieVersion
- !FindInMap
- MediaInsightsEngine
- Release
- Version
- "/cf"
- "/media-insights-stack.template"
Parameters:
DeployTestWorkflow: !Ref DeployTestWorkflow
DeployAnalyticsPipeline: Yes
DeployTestResources: No
MaxConcurrentWorkflows: 5
EnableXrayTrace: Yes

# Deploy Elasticsearch

Expand All @@ -80,8 +77,8 @@ Resources:
- TemplateKeyPrefix
- "/aws-content-analysis-elasticsearch.template"
Parameters:
AnalyticsStreamArn: !GetAtt MediaInsightsFrameworkStack.Outputs.AnalyticsStreamArn
MieDataplaneBucket: !GetAtt MediaInsightsFrameworkStack.Outputs.DataplaneBucket
AnalyticsStreamArn: !GetAtt MieStack.Outputs.AnalyticsStreamArn
MieDataplaneBucket: !GetAtt MieStack.Outputs.DataplaneBucket

# Deploy Auth stack

Expand All @@ -105,10 +102,10 @@ Resources:
- "/aws-content-analysis-auth.template"
Parameters:
AdminEmail: !Ref AdminEmail
WorkflowAPIRestID: !GetAtt MediaInsightsFrameworkStack.Outputs.WorkflowApiRestID
DataplaneAPIRestID: !GetAtt MediaInsightsFrameworkStack.Outputs.DataplaneApiRestID
WorkflowAPIRestID: !GetAtt MieStack.Outputs.WorkflowApiRestID
DataplaneAPIRestID: !GetAtt MieStack.Outputs.DataplaneApiRestID
ElasticDomainArn: !GetAtt ContentAnalysisElasticsearchStack.Outputs.DomainArn
DataplaneBucket: !GetAtt MediaInsightsFrameworkStack.Outputs.DataplaneBucket
DataplaneBucket: !GetAtt MieStack.Outputs.DataplaneBucket

ContentAnalysisWebStack:
Type: "AWS::CloudFormation::Stack"
Expand All @@ -130,10 +127,10 @@ Resources:
- TemplateKeyPrefix
- "/aws-content-analysis-web.template"
Parameters:
DataplaneEndpoint: !GetAtt MediaInsightsFrameworkStack.Outputs.DataplaneApiEndpoint
WorkflowEndpoint: !GetAtt MediaInsightsFrameworkStack.Outputs.WorkflowApiEndpoint
DataplaneEndpoint: !GetAtt MieStack.Outputs.DataplaneApiEndpoint
WorkflowEndpoint: !GetAtt MieStack.Outputs.WorkflowApiEndpoint
ElasticEndpoint: !GetAtt ContentAnalysisElasticsearchStack.Outputs.ElasticEndpoint
DataplaneBucket: !GetAtt MediaInsightsFrameworkStack.Outputs.DataplaneBucket
DataplaneBucket: !GetAtt MieStack.Outputs.DataplaneBucket
UserPoolId: !GetAtt ContentAnalysisAuthStack.Outputs.UserPoolId
IdentityPoolId: !GetAtt ContentAnalysisAuthStack.Outputs.IdentityPoolId
PoolClientId: !GetAtt ContentAnalysisAuthStack.Outputs.UserPoolClientId
Expand Down Expand Up @@ -161,11 +158,11 @@ Resources:
Parameters:
WorkflowCustomResourceArn:
Fn::GetAtt:
- MediaInsightsFrameworkStack
- MieStack
- Outputs.WorkflowCustomResourceArn
OperatorLibraryStack:
Fn::GetAtt:
- MediaInsightsFrameworkStack
- MieStack
- Outputs.OperatorLibraryStack

# Deploy image workflow
Expand All @@ -191,11 +188,11 @@ Resources:
Parameters:
WorkflowCustomResourceArn:
Fn::GetAtt:
- MediaInsightsFrameworkStack
- MieStack
- Outputs.WorkflowCustomResourceArn
OperatorLibraryStack:
Fn::GetAtt:
- MediaInsightsFrameworkStack
- MieStack
- Outputs.OperatorLibraryStack

# Custom resources
Expand Down
61 changes: 49 additions & 12 deletions cloudformation/aws-content-analysis-web.yaml
Original file line number Diff line number Diff line change
Expand Up @@ -27,18 +27,60 @@ Mappings:

Resources:
# Web application resources
# WebsiteBucketNameFunction - derive a name for the website bucket based on the lower case stack name.
WebsiteBucketNameFunction:
Type: AWS::Lambda::Function
Properties:
Code:
ZipFile: |
import string
import random
import cfnresponse
def handler(event, context):
stack_name = event['StackId'].split('/')[1].split('-Uuid')[0]
response_data = {'Data': stack_name.lower() + '-website'}
cfnresponse.send(event, context, cfnresponse.SUCCESS, response_data, "CustomResourcePhysicalID")
Handler: index.handler
Runtime: python3.8
Role: !GetAtt WebsiteBucketNameExecutionRole.Arn
WebsiteBucketNameFunctionPermissions:
Type: AWS::Lambda::Permission
Properties:
Action: 'lambda:InvokeFunction'
FunctionName: !GetAtt WebsiteBucketNameFunction.Arn
Principal: 'cloudformation.amazonaws.com'
WebsiteBucketNameExecutionRole:
Type: AWS::IAM::Role
Properties:
AssumeRolePolicyDocument:
Version: 2012-10-17
Statement:
- Effect: Allow
Principal:
Service:
- lambda.amazonaws.com
Action:
- sts:AssumeRole
Path: /
Policies:
- PolicyName: root
PolicyDocument:
Version: 2012-10-17
Statement:
- Effect: Allow
Action: ['logs:*']
Resource: 'arn:aws:logs:*:*:*'
GetWebsiteBucketName:
Type: Custom::CustomResource
Properties:
ServiceToken: !GetAtt WebsiteBucketNameFunction.Arn

ContentAnalysisWebsiteBucket:
Type: AWS::S3::Bucket
DeletionPolicy: Retain
Properties:
AccessControl: LogDeliveryWrite
BucketName:
Fn::Transform:
Name: 'String'
Parameters:
InputString: !Sub "${AWS::StackName}-website"
Operation: Lower
BucketName: !GetAtt GetWebsiteBucketName.Data
BucketEncryption:
ServerSideEncryptionConfiguration:
- ServerSideEncryptionByDefault:
Expand All @@ -47,12 +89,7 @@ Resources:
IndexDocument: "index.html"
ErrorDocument: "index.html"
LoggingConfiguration:
DestinationBucketName:
Fn::Transform:
Name: 'String'
Parameters:
InputString: !Sub "${AWS::StackName}-website"
Operation: Lower
DestinationBucketName: !GetAtt GetWebsiteBucketName.Data
LogFilePrefix: "access_logs/"
LifecycleConfiguration:
Rules:
Expand Down