
Ability to add custom EventSource and lambda triggers via amplify add function, Kinesis support in analytics category #2463

Merged

Commits (56)
f383c94
feat(amplify-category-function): trig base logic
ambientlight Sep 30, 2019
75ef928
fix(amplify-category-function): lint-fix
ambientlight Sep 30, 2019
c9da7da
feat(graphql-dynamodb-transformer): liftnested out
ambientlight Sep 30, 2019
bfe7a44
feat(amplify-category-function): api/storage trigs
ambientlight Sep 30, 2019
a1ba0ed
fix(amplify-category-function): remove sqs from lambda triggers
ambientlight Oct 19, 2019
31db0aa
feat(cli): optional custom question in service-select-prompt
ambientlight Oct 20, 2019
16294f5
fix(cli): fix param rename in comments in copy-batch
ambientlight Oct 21, 2019
9432e3d
feat(amplify-category-analytics): barebone kinesis
ambientlight Oct 21, 2019
14c95a6
feat(amplify-category-analytics): kinesis cognito policies
ambientlight Oct 21, 2019
9516fe6
feat(amplify-category-analytics): kinesis update support
ambientlight Oct 22, 2019
c462608
feat(amplify-category-function): analytics kinesis trigger support
ambientlight Oct 22, 2019
ef4d0dd
Merge branch 'master' into lambda-custom-event-source
ambientlight Oct 23, 2019
e116ca9
fix: lint fix
ambientlight Oct 23, 2019
bc5385d
fix(amplify-category-function): fix non appsync resources, redund prompt
ambientlight Oct 23, 2019
5aee354
fix(amplify-category-analytics): getIAMPolicies for kinesis
ambientlight Oct 23, 2019
0fa8c87
fix(amplify-category-analytics): create flow filter condition in kinesis
ambientlight Oct 23, 2019
c2e5027
feat(amplify-provider-awscloudformation): nested stack outputs in meta
ambientlight Oct 29, 2019
44d596b
fix(amplify-provider-awscloudformation): logicalId nested, nested forall
ambientlight Oct 30, 2019
f1d7c82
fix(graphql-dynamodb-transformer): remove exposed nested outputs in root
ambientlight Nov 2, 2019
d1bb7a1
feat(amplify-provider-awscloudformation): exports support
ambientlight Nov 2, 2019
047fece
fix(amplify-provider-awscloudformation): depends.exports fixes
ambientlight Nov 2, 2019
6415619
feat(amplify-category-function): dynamoDB lambda trigger with exports
ambientlight Nov 2, 2019
48e5a2c
feat(amplify-category-analytics): check auth rules in kinesis
ambientlight Nov 3, 2019
fbbb939
fix(amplify-provider-awscloudformation): cleaner dependsOn exports
ambientlight Nov 3, 2019
21af03b
fix(amplify-category-function): importValue directly for multienv
ambientlight Nov 3, 2019
5579a9c
feat(amplify-category-analytics): kinesis tweaks, console support
ambientlight Nov 18, 2019
2b628e4
feat(amplify-category-analytics): lock single kinesis resource
ambientlight Nov 18, 2019
548293d
feat(amplify-category-function): collect directives, multi-DDB support
ambientlight Nov 18, 2019
c9a4c70
feat(amplify-category-function): lint fix, missed assign, tran-core dep
ambientlight Nov 18, 2019
41ebcb6
feat(amplify-category-api): model-scoped getIAMPolicies
ambientlight Nov 18, 2019
f8788e0
feat(amplify-provider-awscloudformation): remove exports aggregation
ambientlight Nov 18, 2019
e86b49f
fix(amplify-provider-awscloudformation): fix leftover resourceOutput
ambientlight Dec 19, 2019
fee6dcf
Merge branch 'master' into lambda-custom-event-source
ambientlight Jan 9, 2020
7d53c9e
Revert "feat(amplify-category-api): model-scoped getIAMPolicies"
ambientlight Jan 19, 2020
5758a5b
feat(amplify-category-function): trigger dep GraphQLAPIEndpointOutput
ambientlight Jan 19, 2020
b3f5913
fix(amplify-category-api): fix undefined TransformPackage
ambientlight Jan 19, 2020
0f1e2c7
feat(amplify-category-function): appsync @model tables in storage flow
ambientlight Jan 20, 2020
c30dc36
Merge branch 'master' into lambda-custom-event-source
ambientlight Jan 20, 2020
5b3660c
fix(amplify-category-function): fn::sub reference, storage flow
ambientlight Jan 29, 2020
e3ec6ae
Merge branch 'lambda-custom-event-source' of https://github.com/ambie…
ambientlight Jan 29, 2020
d9903c2
Merge branch 'master' into lambda-custom-event-source
ambientlight Jan 29, 2020
b0af83b
feat(amplify-category-function): model permissions lambda envs
ambientlight Jan 29, 2020
452f46f
fix(amplify-category-analytics): typos, use latest api for service meta
ambientlight Feb 1, 2020
77b4b4c
fix(amplify-category-function): remove scripts, update deps
ambientlight Feb 1, 2020
07a2165
fix(amplify-category-function): guard absent res category in meta
ambientlight Feb 6, 2020
e904ce7
fix(amplify-category-function): package.json update
ambientlight Feb 6, 2020
c7226e7
Merge branch 'master' into lambda-custom-event-source
ambientlight Feb 6, 2020
b58bfc8
fix(amplify-category-function): update flow CF params assume Ref envvar
ambientlight Feb 7, 2020
f6d474b
fix(amplify-category-function): triggerEventSourceMappings as array
ambientlight Feb 11, 2020
30744aa
fix(amplify-category-analytics): extend getIAMPolicies for kinesis
ambientlight Feb 17, 2020
1218ff8
fix(amplify-e2e-tests): account for analytics flow update
ambientlight Feb 20, 2020
fdcba7d
feat(amplify-e2e-tests): sample lambda trigger e2e test
ambientlight Feb 23, 2020
1f79657
feat(amplify-e2e-tests): kinesis trigger, name validation, template
ambientlight Feb 23, 2020
68a0b7e
fix(amplify-category-function): fix inf loop in unexisting ddb, tests
ambientlight Feb 24, 2020
3dc454b
feat(amplify-e2e-tests): additional perm tests, fix ddb arn derivation
ambientlight Feb 25, 2020
3fc7e57
Merge https://github.com/aws-amplify/amplify-cli into lambda-custom-e…
ambientlight Feb 25, 2020
@@ -173,6 +173,39 @@
}
}
<% } %>
<% if (props.triggerEventSourceMapping) { %>
,"LambdaTriggerPolicy": {
"DependsOn": ["LambdaExecutionRole"],
"Type": "AWS::IAM::Policy",
"Properties": {
"PolicyName": "amplify-lambda-execution-policy",
"Roles": [{ "Ref": "LambdaExecutionRole" }],
"PolicyDocument": {
"Version": "2012-10-17",
"Statement": <%- JSON.stringify(props.triggerEventSourceMapping.triggerPolicies) %>
}
}
}
,"LambdaEventSourceMapping": {
"Type": "AWS::Lambda::EventSourceMapping",
"DependsOn": [
"LambdaTriggerPolicy",
"LambdaExecutionRole"
],
"Properties": {
"BatchSize": <%= props.triggerEventSourceMapping.batchSize %>,
"Enabled": true,
"EventSourceArn": <%- JSON.stringify(props.triggerEventSourceMapping.eventSourceArn) %>,
"FunctionName": {
"Fn::GetAtt": [
"LambdaFunction",
"Arn"
]
},
"StartingPosition": "<%= props.triggerEventSourceMapping.startingPosition %>"
}
}
<% } %>
},
"Outputs": {
"Name": {
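Rendered with the defaults the walkthrough sets later (batch size 100, `StartingPosition` of `LATEST`), the conditional EJS block above produces an `AWS::Lambda::EventSourceMapping` resource. A standalone sketch of that shape (the ARN value here is a made-up placeholder):

```javascript
// Sketch of the resource the EJS template renders from
// props.triggerEventSourceMapping; defaults mirror the walkthrough code.
function renderEventSourceMapping(mapping) {
  return {
    Type: 'AWS::Lambda::EventSourceMapping',
    DependsOn: ['LambdaTriggerPolicy', 'LambdaExecutionRole'],
    Properties: {
      BatchSize: mapping.batchSize,
      Enabled: true,
      EventSourceArn: mapping.eventSourceArn,
      FunctionName: { 'Fn::GetAtt': ['LambdaFunction', 'Arn'] },
      StartingPosition: mapping.startingPosition,
    },
  };
}

const resource = renderEventSourceMapping({
  batchSize: 100,
  startingPosition: 'LATEST',
  eventSourceArn: 'arn:aws:kinesis:us-east-1:123456789012:stream/myStream',
});
console.log(JSON.stringify(resource, null, 2));
```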
@@ -0,0 +1,9 @@
exports.handler = function (event, context) { //eslint-disable-line
console.log(JSON.stringify(event, null, 2));
event.Records.forEach((record) => {
console.log(record.eventID);
console.log(record.eventName);
console.log('DynamoDB Record: %j', record.dynamodb);
});
context.done(null, 'Successfully processed DynamoDB record'); // SUCCESS with message
};

why not use the callback model since this isn't an async function? https://docs.aws.amazon.com/lambda/latest/dg/nodejs-prog-model-handler.html

Contributor Author:

Absolutely. I actually just copied this template originally from:

exports.handler = function(event, context) {
//eslint-disable-line
console.log(JSON.stringify(event, null, 2));
event.Records.forEach(record => {
console.log(record.eventID);
console.log(record.eventName);
console.log('DynamoDB Record: %j', record.dynamodb);
});
context.done(null, 'Successfully processed DynamoDB record'); // SUCCESS with message
};


Not suggesting you change it. Just curious. If that's the template used then follow that.

Contributor Author:

It is the default lambda code template for the @model trigger. Thanks for bringing this up; it makes sense to refine both templates. For now the storage category trigger does not use the function-category implementation, even though the trigger logic is essentially the same. If this gets accepted, it would be great to have a follow-up PR that reuses the function-category implementation in the storage flow.

@@ -127,6 +127,25 @@ function copyCfnTemplate(context, category, options, cfnFilename) {
},
]);
break;
case 'lambdaTrigger':
copyJobs.push(...[
{
dir: pluginDir,
template: `function-template-dir/${options.triggerEventSourceMapping.functionTemplateName}`,
target: `${targetDir}/${category}/${options.resourceName}/src/index.js`,
},
{
dir: pluginDir,
template: 'function-template-dir/event.json',
target: `${targetDir}/${category}/${options.resourceName}/src/event.json`,
},
{
dir: pluginDir,
template: 'function-template-dir/package.json.ejs',
target: `${targetDir}/${category}/${options.resourceName}/src/package.json`,
},
]);
break;
default:
copyJobs.push(...[
{
@@ -84,6 +84,13 @@ async function serviceWalkthrough(context, defaultValuesFilename, serviceMetadat
});
}
allDefaultValues.dependsOn = dependsOn;
} else if (answers.functionTemplate === 'lambdaTrigger') {
const eventSourceAnswers = await askEventSourceQuestions(context, inputs);
Object.assign(allDefaultValues, eventSourceAnswers);
if (eventSourceAnswers.dependsOn) {
dependsOn.push(...eventSourceAnswers.dependsOn);
}
allDefaultValues.dependsOn = dependsOn;
}

let topLevelComment;
@@ -561,15 +568,269 @@ async function getTableParameters(context, dynamoAnswers) {
return parameters;
}

-async function askDynamoDBQuestions(context, inputs) {
+async function askEventSourceQuestions(context, inputs) {
const eventSourceTypeInput = inputs.find(input => input.key === 'eventSourceType');
if (eventSourceTypeInput === undefined) {
throw Error('Unable to find eventSourceType question data. (this is likely an amplify error, please report)');
}

const selectEventSourceQuestion = {
type: eventSourceTypeInput.type,
name: eventSourceTypeInput.key,
message: eventSourceTypeInput.question,
choices: eventSourceTypeInput.options,
};

const eventSourceTypeAnswer = await inquirer.prompt([selectEventSourceQuestion]);

let arnInput;
let arnQuestion;
let arnAnswer;
let eventSourceArn;
let dynamoDBStreamKindInput;
let dynamoDbStreamKindQuestion;
let dynamoDbStreamKindAnswer;
let dynamoDbStreamKind;
let dynamoDBCategoryStorageRes;
let dynamoDBCategoryStorageStreamArnRef;
switch (eventSourceTypeAnswer[eventSourceTypeInput.key]) {
case 'kinesis':
arnInput = inputs.find(input => input.key === 'amazonKinesisStreamARN');
if (arnInput === undefined) {
throw Error('Unable to find amazonKinesisStreamARN question data. (this is likely an amplify error, please report)');
}
arnQuestion = {
name: arnInput.key,
message: arnInput.question,
validate: context.amplify.inputValidation(arnInput),
};
arnAnswer = await inquirer.prompt([arnQuestion]);
eventSourceArn = arnAnswer[arnInput.key];
return {
triggerEventSourceMapping: {
batchSize: 100,
startingPosition: 'LATEST',
eventSourceArn,
functionTemplateName: 'trigger-custom.js',
triggerPolicies: [{
Effect: 'Allow',
Action: [
'kinesis:DescribeStream',
'kinesis:DescribeStreamSummary',
'kinesis:GetRecords',
'kinesis:GetShardIterator',
'kinesis:ListShards',
'kinesis:ListStreams',
'kinesis:SubscribeToShard',
],
Resource: eventSourceArn,
}],
},
};

case 'dynamoDB':
dynamoDBStreamKindInput = inputs.find(input => input.key === 'dynamoDbStreamKind');
if (dynamoDBStreamKindInput === undefined) {
throw Error('Unable to find dynamoDBStreamKindInput question data. (this is likely an amplify error, please report)');
}
dynamoDbStreamKindQuestion = {
type: dynamoDBStreamKindInput.type,
name: dynamoDBStreamKindInput.key,
message: dynamoDBStreamKindInput.question,
choices: dynamoDBStreamKindInput.options,
};
dynamoDbStreamKindAnswer = await inquirer.prompt([dynamoDbStreamKindQuestion]);
dynamoDbStreamKind = dynamoDbStreamKindAnswer[dynamoDBStreamKindInput.key];
switch (dynamoDbStreamKind) {
case 'dynamoDbStreamRawARN':
arnInput = inputs.find(input => input.key === 'dynamoDbARN');
if (arnInput === undefined) {
throw Error('Unable to find dynamoDbARN question data. (this is likely an amplify error, please report)');
}
arnQuestion = {
name: arnInput.key,
message: arnInput.question,
validate: context.amplify.inputValidation(arnInput),
};
arnAnswer = await inquirer.prompt([arnQuestion]);
eventSourceArn = arnAnswer[arnInput.key];
return {
triggerEventSourceMapping: {
batchSize: 100,
startingPosition: 'LATEST',
eventSourceArn,
functionTemplateName: 'trigger-dynamodb.js',
triggerPolicies: [{
Effect: 'Allow',
Action: [
'dynamodb:DescribeStream',
'dynamodb:GetRecords',
'dynamodb:GetShardIterator',
'dynamodb:ListStreams',
],
Resource: eventSourceArn,
}],
},
};
case 'graphqlModelTable':
return await askAPICategoryDynamoDBQuestions(context, inputs);
case 'storageDynamoDBTable':
dynamoDBCategoryStorageRes = await askDynamoDBQuestions(context, inputs, true);
dynamoDBCategoryStorageStreamArnRef = {
Ref: `storage${dynamoDBCategoryStorageRes.resourceName}StreamArn`,
};

return {
triggerEventSourceMapping: {
batchSize: 100,
startingPosition: 'LATEST',
eventSourceArn: dynamoDBCategoryStorageStreamArnRef,
functionTemplateName: 'trigger-dynamodb.js',
triggerPolicies: [{
Effect: 'Allow',
Action: [
'dynamodb:DescribeStream',
'dynamodb:GetRecords',
'dynamodb:GetShardIterator',
'dynamodb:ListStreams',
],
Resource: dynamoDBCategoryStorageStreamArnRef,
}],
},
dependsOn: [{
category: 'storage',
resourceName: dynamoDBCategoryStorageRes.resourceName,
attributes: ['StreamArn'],
}],
};
default:
return {};
}
default:
context.print.error('Unrecognized option selected. (this is likely an amplify error, please report)');
return {};
}
}

async function askAPICategoryDynamoDBQuestions(context, inputs) {
const outputs = context.amplify.getResourceOutputs().outputsByCategory;
if (!('api' in outputs) || Object.keys(outputs.api).length === 0) {
throw Error('No resources have been configured in API category');
}

// let resourceNameInput, resourceNameQuestion, resourceNameAnswer, resourceName;
const apiOutput = outputs.api;
const resourceNames = Object.keys(apiOutput);
const resourceNameInput = inputs.find(input => input.key === 'dynamoDbAPIResourceName');
if (resourceNameInput === undefined) {
throw Error('Unable to find dynamoDbAPIResourceName question data. (this is likely an amplify error, please report)');
}

const resourceNameQuestion = {
type: resourceNameInput.type,
name: resourceNameInput.key,
message: resourceNameInput.question,
choices: resourceNames,
};

const resourceNameAnswer = await inquirer.prompt([resourceNameQuestion]);
const resourceName = resourceNameAnswer[resourceNameInput.key];
const resourceOutput = apiOutput[resourceName];

const tableInfos = Object.keys(resourceOutput)
.map(outputName => outputName.match(/^NGetAtt(.*)(TableName|DataSourceName|TableStreamArn)$/))
.filter(match => match)
.map(([, tableName, outputType]) => ({
tableName,
outputType,
value: resourceOutput[`NGetAtt${tableName}${outputType}`],
}))
.reduce((infos, parsedOutput) => {
let partial;
switch (parsedOutput.outputType) {
case 'TableName':
partial = { name: parsedOutput.value };
break;
case 'DataSourceName':
partial = { datasourceName: parsedOutput.value };
break;
case 'TableStreamArn':
partial = { streamArn: parsedOutput.value };
break;
default:
partial = {};
}

return ({
...infos,
[parsedOutput.tableName]: {
...infos[parsedOutput.tableName] || {},
...partial,
},
});
}, {});

const modelNameInput = inputs.find(input => input.key === 'graphqlAPIModelName');
if (modelNameInput === undefined) {
throw Error('Unable to find graphqlAPIModelName question data. (this is likely an amplify error, please report)');
}
if (Object.keys(tableInfos).length === 0) {
throw Error('Unable to find graphql model info.');
}

const modelNameQuestion = {
type: modelNameInput.type,
name: modelNameInput.key,
message: modelNameInput.question,
choices: Object.keys(tableInfos),
};
const modelNameAnswer = await inquirer.prompt([modelNameQuestion]);
const modelName = modelNameAnswer[modelNameInput.key];
const tableInfo = tableInfos[modelName];
if (!('streamArn' in tableInfo)) {
throw Error(`Unable to find associated streamArn for ${modelName} model dynamoDb table.`);
}

const streamArnParamRef = {
Ref: `api${resourceName}NGetAtt${modelName}TableStreamArn`,
};

return {
triggerEventSourceMapping: {
batchSize: 100,
startingPosition: 'LATEST',
eventSourceArn: streamArnParamRef,
functionTemplateName: 'trigger-dynamodb.js',
triggerPolicies: [{
Effect: 'Allow',
Action: [
'dynamodb:DescribeStream',
'dynamodb:GetRecords',
'dynamodb:GetShardIterator',
'dynamodb:ListStreams',
],
Resource: streamArnParamRef,
}],
},
dependsOn: [{
category: 'api',
resourceName,
attributes: [`NGetAtt${modelName}TableStreamArn`],
}],
};
}

async function askDynamoDBQuestions(context, inputs, currentProjectOnly = false) {
const dynamoDbTypeQuestion = {
type: inputs[5].type,
name: inputs[5].key,
message: inputs[5].question,
choices: inputs[5].options,
};
while (true) { //eslint-disable-line
-const dynamoDbTypeAnswer = await inquirer.prompt([dynamoDbTypeQuestion]);
+const dynamoDbTypeAnswer = currentProjectOnly
+? { [inputs[5].key]: 'currentProject' }
+: (await inquirer.prompt([dynamoDbTypeQuestion]));
switch (dynamoDbTypeAnswer[inputs[5].key]) {
case 'currentProject': {
const storageResources = context.amplify.getProjectDetails().amplifyMeta.storage;