Ability to add DynamoDB triggers via amplify add function #997
Indeed, I came across that problem too. The current workaround is to trigger a GraphQL mutation from Lambda, but that's a bit weird. @mikeparisstuff @kaustavghosh06 @UnleashedMind
I would second this request: the ability to configure a Lambda to trigger based on DynamoDB, Cognito, or even SQS.
Steps to reproduce:
Have an API configured that is backed by DynamoDB.
Also have this issue. It seems like creating a Lambda function doesn't recognize the DynamoDB backend created when using
Update: Until that is the case, or until you can choose your existing DynamoDB table in the Amplify prompts, here is a manual workaround. Add your parameters (note: AVOID COMMITTING THESE TO GIT, or use fake values):

```json
"Parameters": {
  "env": {
    "Type": "String"
  },
  "storagetododynamoName": {
    "Type": "String",
    "Default": "<your-db-name>"
  },
  "storagetododynamoArn": {
    "Type": "String",
    "Default": "<your-db-arn>"
  }
},
```

Then manually append the policies that would otherwise be generated for you:

```json
"lambdaexecutionpolicy": {
  "DependsOn": ["LambdaExecutionRole"],
  "Type": "AWS::IAM::Policy",
  "Properties": {
    "PolicyName": "lambda-execution-policy",
    "Roles": [{ "Ref": "LambdaExecutionRole" }],
    "PolicyDocument": {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": [
            "logs:CreateLogGroup",
            "logs:CreateLogStream",
            "logs:PutLogEvents"
          ],
          "Resource": {
            "Fn::Sub": [
              "arn:aws:logs:${region}:${account}:log-group:/aws/lambda/${lambda}:log-stream:*",
              {
                "region": { "Ref": "AWS::Region" },
                "account": { "Ref": "AWS::AccountId" },
                "lambda": { "Ref": "LambdaFunction" }
              }
            ]
          }
        },
        {
          "Effect": "Allow",
          "Action": [
            "dynamodb:GetItem",
            "dynamodb:Query",
            "dynamodb:Scan",
            "dynamodb:PutItem",
            "dynamodb:UpdateItem",
            "dynamodb:DeleteItem"
          ],
          "Resource": [{ "Ref": "storage<your-db-name>dynamoArn" }]
        }
      ]
    }
  }
},
"AmplifyResourcesPolicy": {
  "DependsOn": ["LambdaExecutionRole"],
  "Type": "AWS::IAM::Policy",
  "Properties": {
    "PolicyName": "amplify-lambda-execution-policy",
    "Roles": [{ "Ref": "LambdaExecutionRole" }],
    "PolicyDocument": {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": [
            "dynamodb:Put*",
            "dynamodb:Create*",
            "dynamodb:BatchWriteItem",
            "dynamodb:Get*",
            "dynamodb:BatchGetItem",
            "dynamodb:List*",
            "dynamodb:Describe*",
            "dynamodb:Scan",
            "dynamodb:Query",
            "dynamodb:Update*",
            "dynamodb:RestoreTable*",
            "dynamodb:Delete*"
          ],
          "Resource": [{ "Ref": "storage<your-db-name>dynamoArn" }]
        }
      ]
    }
  }
}
```

If you used fake parameters, visit your Lambda function in the Lambda console and add the environment variables there.
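With the parameters and policies above in place, the function can pick up the table name from its environment. A minimal sketch of how a handler might build a PutItem request (the environment variable name `STORAGE_TODODYNAMO_NAME` and the fallback value are illustrative assumptions, not values confirmed in this thread; only string attributes are handled here):

```python
import os

def build_put_item_params(item, table_name=None):
    """Build kwargs for a boto3 DynamoDB put_item call against the Amplify-provisioned table."""
    # Fall back to the env var that the CloudFormation parameter is expected to feed.
    table = table_name or os.environ.get("STORAGE_TODODYNAMO_NAME", "todo-dev")
    return {
        "TableName": table,
        # Wrap each value in DynamoDB's AttributeValue format; strings only in this sketch.
        "Item": {k: {"S": str(v)} for k, v in item.items()},
    }

params = build_put_item_params({"id": "1", "title": "buy milk"})
```

These kwargs would then be passed to a `boto3` DynamoDB client's `put_item` call inside the actual Lambda.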
Any updates to this? It seems that the tutorial in the announcement is using two tables: one from the
Any updates? There is currently no way of creating custom resolvers with Lambda that can connect to DynamoDB, as far as I'm aware.
Looking forward to this.
I think you can add the function, then go to the DynamoDB table, open "Triggers", click "Create trigger", and attach the trigger manually to that existing function.
Hi, I scraped the net for a solution that works from Amplify but found very little (nothing). Is this solved? I'm using the latest Amplify, updated yesterday, and I still cannot add a trigger for my function using amplify add function. My DynamoDB table was created with add api, so I'm suffering the same problems as above: "There are no DynamoDB resources configured in your project currently". Is there a valid workaround for Amplify? Any blog post that explains this in more detail? Thanks.
@Buder what I ended up doing is creating a function that accepts a
Ok, so you created a standard Lambda? Like the amplify add function "Hello world" option? And then what makes it accept a ___TABLE_NAME? And don't you have to enable the stream/trigger on the table and add some Lambda ID (ARN) to that trigger? And how do we set the IAM role permissions then? Not expecting you to help me with all this, but they're still valid questions. :)
@Buder Yes, you create the function from the CLI. If you want the function to have access to that table only, you can scroll down on the Lambda page and edit that function's IAM role; there you can edit the policy and add the permissions you need, for example the DynamoDB PutItem action, and give it the table ARN so that it can access only that table and not other tables.
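For the scoped-down role edit described above, a minimal IAM statement might look like the following. This is a sketch: the region, account ID, and table name in the ARN are placeholders, not values from this thread.

```json
{
  "Effect": "Allow",
  "Action": ["dynamodb:PutItem"],
  "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/<your-table>"
}
```

Because the `Resource` names a single table ARN rather than `*`, the function can write to that table but cannot touch any other table in the account.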
Ok, I see. Yes, I can do this, but it requires me to create some polling sequence to react to the INSERT of a new table item. What I am looking for is a trigger, so that the Lambda executes when an INSERT happens in the table. As far as I understand, your solution does not handle that. On the other hand, I think it is possible to associate the hello-world Lambda with the table trigger from DynamoDB by setting it up there and enabling a stream, for instance.
@Buder This is actually something I am looking into right now. You can add a trigger through the DynamoDB console and either create a Lambda function or use an existing one for that trigger. Choosing "create a new function" didn't work for me, so I created one manually, and I needed to add the correct policies to that function's role so that it would work with the DynamoDB trigger (DynamoDB will tell you which permissions to add).
Ok, let me know your findings here; they're valuable for everyone. I will try this weekend to set up a Lambda and trigger it from a table insert. Do you know if this will also work when invoking the Lambda with the amplify command (running the Lambda locally, I guess)? I will post if I get it working in some way.
I guess it will work, but it won't pass the parameters that the trigger passes. Instead, you can just change/create some dummy data in the DynamoDB table and then change it back, and it should invoke the trigger.
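Once the trigger fires, the handler receives a DynamoDB Streams event whose records carry the changed items in DynamoDB's AttributeValue format. A rough sketch of unmarshalling the NewImage of INSERT records (pure dict handling, no AWS calls; the attribute names in the sample event are made up for illustration):

```python
def unmarshal(attr):
    """Convert a DynamoDB AttributeValue ({'S': ...}, {'N': ...}, ...) to a plain Python value."""
    (type_tag, value), = attr.items()
    if type_tag == "N":
        # Numbers arrive as strings in stream records.
        return float(value) if "." in value else int(value)
    if type_tag == "M":
        return {k: unmarshal(v) for k, v in value.items()}
    if type_tag == "L":
        return [unmarshal(v) for v in value]
    return value  # S, BOOL, etc. pass through unchanged in this sketch

def handler(event, context=None):
    """Collect the plain-dict NewImage of every INSERT record in the stream batch."""
    inserted = []
    for record in event.get("Records", []):
        if record.get("eventName") == "INSERT":
            image = record["dynamodb"]["NewImage"]
            inserted.append({k: unmarshal(v) for k, v in image.items()})
    return inserted

# A trimmed-down stand-in for the event shape a stream trigger delivers.
sample_event = {
    "Records": [{
        "eventName": "INSERT",
        "dynamodb": {"NewImage": {"id": {"S": "1"}, "count": {"N": "2"}}},
    }]
}
```

Feeding a hand-built event like `sample_event` into `handler` is also a convenient way to test the function locally without touching the table.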
It would be nice if we could have an
All I needed to do was add the following policy; I already had everything else needed in the function's CloudFormation template. And I added "someTableStreamArn" as a parameter for the template.
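A "someTableStreamArn" parameter would typically be consumed by a stream-read statement. As a sketch (mirroring the actions in AWS's managed AWSLambdaDynamoDBExecutionRole policy, not necessarily the commenter's exact snippet), such a statement looks like:

```json
{
  "Effect": "Allow",
  "Action": [
    "dynamodb:DescribeStream",
    "dynamodb:GetRecords",
    "dynamodb:GetShardIterator",
    "dynamodb:ListStreams"
  ],
  "Resource": [{ "Ref": "someTableStreamArn" }]
}
```

These stream actions are distinct from the table actions (GetItem, PutItem, ...) shown earlier; a Lambda event source mapping needs them on the stream ARN specifically.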
Any updates on this?
Hey guys, we released this functionality in the latest version of our CLI, 4.16.1. This was merged as part of PR #2463.
Is it possible to make an existing function trigger from DynamoDB with
@mrgrue did you find any solution for your problem? I'm having the same situation.
@vrebo Not really. I ended up just deleting the function and recreating it and making sure to set it up as a trigger at creation time. |
Yeah, not a big deal. Move your function to a different folder name (e.g. function2), recreate the function through the CLI, then copy the files from the function2 folder back over the top of those in the function's folder.
Can someone please create a separate feature request for this? Much appreciated! I spent close to half an hour fumbling to find a way to disable an existing trigger.
I had some odd behavior when I was extending an S3 trigger and wanted to add access to an existing AppSync DynamoDB table. It seemed like it started making config changes, then hit a circular dependency and failed. It took me about 3 hours to sort out what it had done. After doing this again, I now see it edited the S3Triggeredabc123-cloudformation-template.json with an amplify-lambda-execution-policy "PolicyName" and also edited s3-cloudformation-template.json with the exact same policy name. The result is that the DynamoDB policy comes in and overwrites the prior one, losing the S3 permissions. I went in and renamed them to include an 's3' or 'dynamodb' prefix respectively so they would not conflict.
Any updates on this? The CLI only generates policies for AppSync; we need policies for DynamoDB, like this:

```json
{
  "Effect": "Allow",
  "Action": [
    "dynamodb:GetItem",
    "dynamodb:Query",
    "dynamodb:Scan",
    "dynamodb:PutItem",
    "dynamodb:UpdateItem",
    "dynamodb:DeleteItem"
  ],
  "Resource": [{ "Ref": "storage<your-db-name>dynamoArn" }]
}
```
This issue has been automatically locked since there hasn't been any recent activity after it was closed. Please open a new issue for related bugs. Looking for a help forum? We recommend joining the Amplify Community Discord server.
Is your feature request related to a problem? Please describe.
I've created a GraphQL API backend and declared tables in my schema using @model. I would like to create a function to populate a field on the table on insert. Attempting this caused the following error:
There are no DynamoDB resources configured in your project currently
Describe the solution you'd like
I'd like Amplify to recognize the DynamoDB tables created via the GraphQL schema so that I can add Lambda function "triggers".
Describe alternatives you've considered
I'll probably do the work on the front end for now.