
Ability to add DynamoDB triggers via amplify add function #997

Closed
kldeb opened this issue Mar 8, 2019 · 30 comments

Comments

@kldeb (Contributor) commented Mar 8, 2019

Is your feature request related to a problem? Please describe.
I've created a graphql api backend and declared tables in my schema using @model. I would like to create a function to populate a field on the table on insert. Attempting this caused the following error:

There are no DynamoDB resources configured in your project currently

Describe the solution you'd like
I'd like amplify to recognize the DynamoDB tables created via the GraphQL schema so that I can add Lambda function "triggers".

Describe alternatives you've considered
I'll probably do the work on the front end for now.

@cptflammin commented Mar 19, 2019

Indeed, came across that problem too. Current workaround is to trigger a GraphQL mutation from Lambda, but that's a bit weird.
https://read.acloud.guru/backend-graphql-how-to-trigger-an-aws-appsync-mutation-from-aws-lambda-eda13ebc96c3

@mikeparisstuff @kaustavghosh06 @UnleashedMind
Question:

  • By design, is it better to go through a GraphQL mutation anyway rather than accessing DDB directly from Lambda?
  • What about side effects if the schema gets modified and the Lambda is not updated?

@rawadrifai commented Jun 2, 2019

I would second this request: the ability to configure a Lambda to trigger based on DynamoDB, Cognito, or even SQS.

@YikSanChan (Contributor) commented Jun 7, 2019

Steps to reproduce:

> amplify function add
Using service: Lambda, provided by: awscloudformation
? Provide a friendly name for your resource to be used as a label for this category in the project: myfunction
? Provide the AWS Lambda function name: myfunction
? Choose the function template that you want to use: CRUD function for Amazon DynamoDB table (Integration with Amazon API Gateway and Amazon DynamoDB)
? Choose a DynamoDB data source option Use DynamoDB table configured in the current Amplify project
There are no DynamoDB resources configured in your project currently

I have configured an API backed by DynamoDB.

@janhesters commented Jun 10, 2019

I also have this issue. It seems that creating a Lambda function doesn't recognize the DynamoDB backend created when running amplify add api and choosing graphql.

@janhesters commented Jun 17, 2019

Update:
If the DynamoDB table created by amplify add api were listed in amplify-meta.json, you could access it via the environment variables in the CloudFormation template (similar to how env and <your-appsync-api>GraphQLAPIIdOutput are accessed in "Parameters").

Until that is the case or until you can choose your existing DynamoDB in the Amplify prompts, here is a manual workaround:

Add your parameters (note: AVOID COMMITTING THESE TO GIT, or use fake values):

	"Parameters": {
		"env": {
			"Type": "String"
		},
		"storagetododynamoName": {
			"Type": "String",
			"Default": "<your-db-name>"
		},
		"storagetododynamoArn": {
			"Type": "String",
			"Default": "<your-db-arn>"
		}
	},
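For those parameters to actually reach the function code, they also need to be passed into the Lambda's environment. A sketch of what that wiring might look like on the LambdaFunction resource (the environment variable names here are illustrative, chosen to match the parameters above; they are not CLI-generated names):

```json
"LambdaFunction": {
    "Type": "AWS::Lambda::Function",
    "Properties": {
        "Environment": {
            "Variables": {
                "ENV": { "Ref": "env" },
                "STORAGE_TODODYNAMO_NAME": { "Ref": "storagetododynamoName" },
                "STORAGE_TODODYNAMO_ARN": { "Ref": "storagetododynamoArn" }
            }
        }
    }
}
```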

Append the policies manually that would otherwise be generated for you:

"lambdaexecutionpolicy": {
    "DependsOn": ["LambdaExecutionRole"],
    "Type": "AWS::IAM::Policy",
    "Properties": {
        "PolicyName": "lambda-execution-policy",
        "Roles": [{ "Ref": "LambdaExecutionRole" }],
        "PolicyDocument": {
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Effect": "Allow",
                    "Action": [
                        "logs:CreateLogGroup",
                        "logs:CreateLogStream",
                        "logs:PutLogEvents"
                    ],
                    "Resource": {
                        "Fn::Sub": [
                            "arn:aws:logs:${region}:${account}:log-group:/aws/lambda/${lambda}:log-stream:*",
                            {
                                "region": { "Ref": "AWS::Region" },
                                "account": { "Ref": "AWS::AccountId" },
                                "lambda": { "Ref": "LambdaFunction" }
                            }
                        ]
                    }
                },
                {
                    "Effect": "Allow",
                    "Action": [
                        "dynamodb:GetItem",
                        "dynamodb:Query",
                        "dynamodb:Scan",
                        "dynamodb:PutItem",
                        "dynamodb:UpdateItem",
                        "dynamodb:DeleteItem"
                    ],
                    "Resource": [{ "Ref": "storage<your-db-name>dynamoArn" }]
                }
            ]
        }
    }
},
"AmplifyResourcesPolicy": {
    "DependsOn": ["LambdaExecutionRole"],
    "Type": "AWS::IAM::Policy",
    "Properties": {
        "PolicyName": "amplify-lambda-execution-policy",
        "Roles": [{ "Ref": "LambdaExecutionRole" }],
        "PolicyDocument": {
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Effect": "Allow",
                    "Action": [
                        "dynamodb:Put*",
                        "dynamodb:Create*",
                        "dynamodb:BatchWriteItem",
                        "dynamodb:Get*",
                        "dynamodb:BatchGetItem",
                        "dynamodb:List*",
                        "dynamodb:Describe*",
                        "dynamodb:Scan",
                        "dynamodb:Query",
                        "dynamodb:Update*",
                        "dynamodb:RestoreTable*",
                        "dynamodb:Delete*"
                    ],
                    "Resource": [{ "Ref": "storage<your-db-name>dynamoArn" }]
                }
            ]
        }
    }
}

If you used fake parameters, visit your Lambda function in the Lambda console and add the environment variables (storage<your-db-name>dynamoName and storage<your-db-name>dynamoArn) after you push the function (pushing will overwrite any existing variables). That way you can use your DynamoDB table as if you had chosen CRUD function for Amazon DynamoDB table (Integration with Amazon API Gateway and Amazon DynamoDB) in amplify add function.
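Inside the function, those variables can then be read from process.env. A small sketch (the variable name is an assumption matching the workaround's parameters, and the aws-sdk usage is shown in comments only):

```javascript
// Resolve the table name from the environment variable described in
// the workaround above (the name is an assumption, not CLI output).
function getTableName(env) {
  const name = env.storagetododynamoName;
  if (!name) {
    throw new Error('storagetododynamoName is not set');
  }
  return name;
}

// In the handler you would then use it with the DocumentClient, e.g.:
//   const AWS = require('aws-sdk');
//   const docClient = new AWS.DynamoDB.DocumentClient();
//   await docClient.put({ TableName: getTableName(process.env),
//                         Item: { id: '1' } }).promise();
```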

(Screenshot of the Lambda console environment-variables section omitted.)

@regenrek commented Jun 24, 2019

Any updates on this? It seems that the tutorial in the announcement uses two tables: one from the GraphQL schema with the @model directive and one from the storage category.

https://aws.amazon.com/de/blogs/mobile/amplify-framework-adds-support-for-aws-lambda-functions-and-amazon-dynamodb-custom-indexes-in-graphql-schemas/#

@idanlo commented Jul 22, 2019

Any updates? There is currently no way of creating custom resolvers with Lambda that can connect to DynamoDB, as far as I'm aware.

@monaye commented Aug 2, 2019

looking forward to this

@irelandpaul commented Aug 9, 2019

I think you can add the function, then go to the DynamoDB table, open "Triggers", click "Create trigger", and manually attach the trigger to that existing function.

@Buder commented Sep 20, 2019

Hi,

I've scraped the net for a solution that works from Amplify but found very little (nothing). Is this solved? I'm using the latest Amplify, updated yesterday, and I still cannot add a trigger for my function using amplify add function. My DDB was created with add api, so I'm suffering the same problems as above.

"There are no DynamoDB resources configured in your project currently"

Is there a valid workaround for Amplify? Any blog post that explains in more detail?

thanks

@idanlo commented Sep 20, 2019

@Buder What I ended up doing is creating a function that accepts a ___TABLE_NAME environment variable and setting it manually in the AWS Lambda console. If you create a new API or change the table names, you will need to go to that Lambda's console and set it manually again.

@Buder commented Sep 20, 2019

@idanlo

Ok, so you created a standard Lambda, like the amplify add function "Hello world" option? And then what makes it accept a ___TABLE_NAME?

And don't you have to enable the stream/trigger on the table and add the Lambda ARN to that trigger? And how do we set the IAM role permissions then?

Not expecting you to help me with all this, but still valid questions.. :)

@idanlo commented Sep 20, 2019

@Buder Yes, you create the function from the CLI using amplify add function. To give it the environment variable, go to that Lambda's page in the AWS Lambda console; when you scroll down you'll see a list of environment variables where you can add your own. For example, you can add a USER_TABLE_NAME variable and then access it in the code to perform actions on a DynamoDB table.

If you want the function to have access to that table only, scroll down on the Lambda page and edit that function's IAM role. There you can edit the policy and add the permissions you need, for example the DynamoDB PutItem action, and give it the table ARN so that it can only access that table and no others.
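To illustrate that last point, a scoped-down inline policy might look like this (the table ARN is a placeholder):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["dynamodb:PutItem"],
            "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/MyTable"
        }
    ]
}
```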

@Buder commented Sep 20, 2019

@idanlo

Ok, I see. Yes, I can do this, but it requires me to create some polling sequence to react to the INSERT of a new table item. What I am looking for is a trigger, so that the Lambda executes when an INSERT happens in the table. As far as I understand, your solution does not handle that.

On the other hand, I think it is possible to associate the hello world Lambda with the table trigger from DynamoDB by setting some data there and enabling a stream, for instance.

@idanlo commented Sep 20, 2019

@Buder This is actually something I am looking into right now. You can add a trigger through the DynamoDB console and either create a Lambda function or use an existing one for that trigger. Choosing "create a new function" didn't work for me, so I created one manually and added the correct policies to that function's role so that it would work with the DynamoDB trigger (DynamoDB will tell you what permissions to add).

@Buder commented Sep 20, 2019

@idanlo

Ok, let me know your findings here; they're valuable for all. I will try this weekend to set up a Lambda and trigger it from a table insert. Do you know if this will also work when invoking the Lambda with the amplify command (running the Lambda locally, I guess)?

I will post if I get it working in some way.

@idanlo commented Sep 20, 2019

I guess it will work, but it won't pass the parameters that the trigger passes. Instead you can change/create some dummy data in the DynamoDB table and then change it back; that should invoke the trigger.

@kldeb (Contributor, Author) commented Sep 26, 2019

It would be nice if we could have an @trigger directive that works like the @function directive.

@ambientlight (Contributor) commented Sep 30, 2019

A pull request resolving this has been added: #2463

@aireater:
This doesn't add a @trigger transformer though; that could be another feature request. But keeping anything unrelated to the GraphQL API out of the transformers may be easier to manage and slightly more flexible.

@mormsbee commented Nov 11, 2019

All I needed to do was add the following EventSourceMapping resource; I already had everything else needed in the function's CloudFormation template:

    "someTableTrigger": {
        "Type": "AWS::Lambda::EventSourceMapping",
        "DependsOn": ["AmplifyResourcesPolicy"],
        "Properties": {
            "BatchSize": 1,
            "Enabled": true,
            "EventSourceArn": { "Ref": "someTableStreamArn" },
            "FunctionName": { "Fn::GetAtt": ["LambdaFunction", "Arn"] },
            "StartingPosition": "LATEST"
        }
    }

And I added "someTableStreamArn" as a parameter for the template.

@ohsik commented Mar 11, 2020

Any updates on this?

@kaustavghosh06 (Contributor) commented Mar 11, 2020

Hey guys, we released this functionality in the latest version of our CLI, 4.16.1. It was merged as part of PR #2463.

@mrgrue commented Jun 24, 2020

Is it possible to make an existing function trigger from DynamoDB with amplify function update, or can I only do that when I create the function?

@vrebo commented Nov 25, 2020

@mrgrue did you find any solution for your problem? I'm having the same situation.

@mrgrue commented Nov 25, 2020

@vrebo Not really. I ended up just deleting the function and recreating it and making sure to set it up as a trigger at creation time.


@nubpro (Contributor) commented Dec 2, 2020

> Is it possible to make an existing function trigger from DynamoDB with amplify function update, or can I only do that when I create the function?

Can someone please create a separate feature request for this? Much appreciated!

I spent close to half an hour fumbling to find a way to disable an existing trigger.

@cybercussion commented Jan 5, 2021

I had some odd behavior when I was extending an S3 trigger and wanting to add access to an existing AppSync DynamoDB table. It seemed like it started making config changes, then hit a circular dependency and failed. It took me about 3 hours to sort out what it did. It removed the S3 policy, and I could not figure out how to put it back; I had to stand up another environment to compare what was different.

After doing this again, I now see it edited S3Triggeredabc123-cloudformation-template.json with an amplify-lambda-execution-policy "PolicyName" and also edited s3-Cloudformation-template.json with the exact same policy name. The result is that the DynamoDB policy comes in and overwrites the prior one, losing the S3 permissions. I went in and renamed them to include an "s3" or "dynamodb" prefix respectively so they would not conflict.

@yaquawa commented Feb 10, 2021

Any updates on this? The CLI only generates policies for AppSync; we need policies for DynamoDB like this:

        {
            "Effect": "Allow",
            "Action": [
                "dynamodb:GetItem",
                "dynamodb:Query",
                "dynamodb:Scan",
                "dynamodb:PutItem",
                "dynamodb:UpdateItem",
                "dynamodb:DeleteItem"
            ],
            "Resource": [{ "Ref": "storage<your-db-name>dynamoArn" }]
        }

@github-actions bot commented May 25, 2021

This issue has been automatically locked since there hasn't been any recent activity after it was closed. Please open a new issue for related bugs.

Looking for a help forum? We recommend joining the Amplify Community Discord server *-help channels for those types of questions.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators May 25, 2021