
Ability to add DynamoDB triggers via amplify add function #997

Open
aireater opened this issue Mar 8, 2019 · 20 comments


@aireater aireater commented Mar 8, 2019

Is your feature request related to a problem? Please describe.
I've created a graphql api backend and declared tables in my schema using @model. I would like to create a function to populate a field on the table on insert. Attempting this caused the following error:

There are no DynamoDB resources configured in your project currently

Describe the solution you'd like
I'd like Amplify to recognize the DynamoDB tables created via the GraphQL schema so that I can add Lambda function "triggers".

Describe alternatives you've considered
I'll probably do the work on the front end for now.
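For context, a minimal sketch of the kind of trigger function this request is about might look like the following: a DynamoDB Streams handler that computes a derived field for each newly inserted item. The handler name, the `id`/`title` attributes, and the slug logic are all illustrative assumptions, not anything Amplify generates.

```python
# Hypothetical sketch of a DynamoDB Streams trigger that populates a
# derived field on INSERT. Attribute names ("id", "title", "slug") are
# illustrative, not part of any Amplify-generated schema.

def handler(event, context):
    """Collect derived-field updates for each INSERT record in a stream event."""
    updates = []
    for record in event.get("Records", []):
        if record.get("eventName") != "INSERT":
            continue  # only react to newly inserted items
        new_image = record["dynamodb"]["NewImage"]
        item_id = new_image["id"]["S"]
        title = new_image["title"]["S"]
        # Derived field: a URL slug computed from the title.
        updates.append({"id": item_id, "slug": title.lower().replace(" ", "-")})
    # A real implementation would write these back with boto3's
    # update_item; returning them keeps the sketch self-contained.
    return updates
```
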

@cptflammin cptflammin commented Mar 19, 2019

Indeed, I came across that problem too. The current workaround is to trigger a GraphQL mutation from Lambda, but that's a bit awkward.
https://read.acloud.guru/backend-graphql-how-to-trigger-an-aws-appsync-mutation-from-aws-lambda-eda13ebc96c3
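The linked workaround boils down to calling the AppSync GraphQL endpoint over HTTPS from the Lambda. A rough sketch of building such a request is below; the endpoint, API key, mutation name, and input shape are all placeholders, not values from any real project.

```python
import json

# Sketch of the "call an AppSync mutation from Lambda" workaround.
# Endpoint, API key, and the UpdateItem mutation are placeholder assumptions.

def build_mutation_request(endpoint, api_key, item_id, value):
    """Return (url, headers, body) for a hypothetical AppSync GraphQL mutation."""
    mutation = """
    mutation UpdateItem($input: UpdateItemInput!) {
      updateItem(input: $input) { id }
    }
    """
    body = json.dumps({
        "query": mutation,
        "variables": {"input": {"id": item_id, "value": value}},
    })
    headers = {"Content-Type": "application/json", "x-api-key": api_key}
    # The actual call would be made with urllib.request.urlopen(
    # urllib.request.Request(endpoint, body.encode(), headers)).
    return endpoint, headers, body
```
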

@mikeparisstuff @kaustavghosh06 @UnleashedMind
Question:

  • By design, is it better to go through a GraphQL mutation anyway rather than accessing DynamoDB directly from Lambda?
  • What about side effects if the schema gets modified and the Lambda is not updated?
@rawadrifai rawadrifai commented Jun 2, 2019

I would second this request: the ability to configure a Lambda to trigger based on DynamoDB, Cognito, or even SQS.

@YikSanChan (Contributor) commented Jun 7, 2019

Steps to reproduce:

> amplify function add
Using service: Lambda, provided by: awscloudformation
? Provide a friendly name for your resource to be used as a label for this category in the project: myfunction
? Provide the AWS Lambda function name: myfunction
? Choose the function template that you want to use: CRUD function for Amazon DynamoDB table (Integration with Amazon API Gateway and Amazon DynamoDB)
? Choose a DynamoDB data source option Use DynamoDB table configured in the current Amplify project
There are no DynamoDB resources configured in your project currently

I have already configured an API backed by DynamoDB.

@janhesters janhesters commented Jun 10, 2019

I also have this issue. It seems that creating a Lambda function doesn't recognize the DynamoDB backend created when you run amplify add api and choose GraphQL.

@janhesters janhesters commented Jun 17, 2019

Update:
If the DynamoDB table created by amplify add api were displayed in amplify-meta.json, you could access it via the environment variables in the CloudFormation template (similar to how env and <your-appsync-api>GraphQLAPIIdOutput are accessed in "Parameters").

Until that is the case or until you can choose your existing DynamoDB in the Amplify prompts, here is a manual workaround:

Add your parameters (note: AVOID COMMITTING THESE TO GIT, or use fake values):

	"Parameters": {
		"env": {
			"Type": "String"
		},
		"storagetododynamoName": {
			"Type": "String",
			"Default": "<your-db-name>"
		},
		"storagetododynamoArn": {
			"Type": "String",
			"Default": "<your-db-arn>"
		}
	},

Manually append the policies that would otherwise be generated for you automatically:

"lambdaexecutionpolicy": {
    "DependsOn": ["LambdaExecutionRole"],
    "Type": "AWS::IAM::Policy",
    "Properties": {
    "PolicyName": "lambda-execution-policy",
    "Roles": [{ "Ref": "LambdaExecutionRole" }],
    "PolicyDocument": {
        "Version": "2012-10-17",
        "Statement": [
        {
            "Effect": "Allow",
            "Action": [
            "logs:CreateLogGroup",
            "logs:CreateLogStream",
            "logs:PutLogEvents"
            ],
            "Resource": {
            "Fn::Sub": [
                "arn:aws:logs:${region}:${account}:log-group:/aws/lambda/${lambda}:log-stream:*",
                {
                "region": { "Ref": "AWS::Region" },
                "account": { "Ref": "AWS::AccountId" },
                "lambda": { "Ref": "LambdaFunction" }
                }
            ]
            }
        },
        {
            "Effect": "Allow",
            "Action": [
            "dynamodb:GetItem",
            "dynamodb:Query",
            "dynamodb:Scan",
            "dynamodb:PutItem",
            "dynamodb:UpdateItem",
            "dynamodb:DeleteItem"
            ],
            "Resource": [{ "Ref": "storage<your-db-name>dynamoArn" }]
        }
        ]
    }
    }
},
"AmplifyResourcesPolicy": {
    "DependsOn": ["LambdaExecutionRole"],
    "Type": "AWS::IAM::Policy",
    "Properties": {
    "PolicyName": "amplify-lambda-execution-policy",
    "Roles": [{ "Ref": "LambdaExecutionRole" }],
    "PolicyDocument": {
        "Version": "2012-10-17",
        "Statement": [
        {
            "Effect": "Allow",
            "Action": [
            "dynamodb:Put*",
            "dynamodb:Create*",
            "dynamodb:BatchWriteItem",
            "dynamodb:Get*",
            "dynamodb:BatchGetItem",
            "dynamodb:List*",
            "dynamodb:Describe*",
            "dynamodb:Scan",
            "dynamodb:Query",
            "dynamodb:Update*",
            "dynamodb:RestoreTable*",
            "dynamodb:Delete*"
            ],
            "Resource": [{ "Ref": "storage<your-db-name>dynamoArn" }]
        }
        ]
    }
    }
}

If you used fake parameters, visit your Lambda function in the Lambda console and add the environment variables (storage<your-db-name>dynamoName and storage<your-db-name>dynamoArn) after you have pushed the function (pushing will overwrite any existing variables). That way you can use your DynamoDB table as if you had chosen CRUD function for Amazon DynamoDB table (Integration with Amazon API Gateway and Amazon DynamoDB) in amplify add function.
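On the function side, the workaround's environment variables can be read like this. The variable names below follow the storage<name>dynamo* convention from the comment above with "todo" as the example table name; they are assumptions, adjust them to whatever you actually set in the console.

```python
import os

# Sketch of resolving the table name/ARN that the workaround above sets as
# Lambda environment variables. The "storagetododynamo*" names are assumed
# from the thread's "todo" example, not guaranteed by Amplify.

def get_table_config():
    """Resolve the DynamoDB table name and ARN from the environment."""
    try:
        return {
            "table_name": os.environ["storagetododynamoName"],
            "table_arn": os.environ["storagetododynamoArn"],
        }
    except KeyError as missing:
        # Fail loudly if the variables were never added in the console.
        raise RuntimeError(f"Missing environment variable: {missing}")
```
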


@regenrek regenrek commented Jun 24, 2019

Any updates on this? It seems that the tutorial in the announcement uses two tables: one from schema.graphql with the @model directive and one from the storage category.

https://aws.amazon.com/de/blogs/mobile/amplify-framework-adds-support-for-aws-lambda-functions-and-amazon-dynamodb-custom-indexes-in-graphql-schemas/#

@idanlo idanlo commented Jul 22, 2019

Any updates? There is currently no way of creating custom resolvers with Lambda that can connect to DynamoDB, as far as I'm aware.

@monaye monaye commented Aug 2, 2019

looking forward to this

@irelandpaul irelandpaul commented Aug 9, 2019

I think you can add the function, then go to the DynamoDB table, open "Triggers", click "Create trigger", and manually attach the trigger to that existing function.

@Buder Buder commented Sep 20, 2019

Hi,

I scraped the net for a solution that works from Amplify but found very little (nothing). Is this solved? I'm using the latest Amplify, updated yesterday, and I still cannot add a trigger for my function using amplify add function. My DynamoDB table was created with amplify add api, so I'm suffering the same problems as above.

"There are no DynamoDB resources configured in your project currently"

Is there a valid workaround for Amplify? Any blog post that explains in more detail?

thanks

@idanlo idanlo commented Sep 20, 2019

@Buder What I ended up doing is creating a function that accepts a ___TABLE_NAME environment variable and manually setting it in the AWS Lambda console. If you create a new API or change the table names, you will need to go to that Lambda's console and set it manually again.

@Buder Buder commented Sep 20, 2019

@idanlo

Ok, so you created a standard Lambda, like the amplify add function "Hello World" option? And then what makes it accept a ___TABLE_NAME?

And don't you have to enable a stream/trigger on the table and add some Lambda ID (ARN) to that trigger? And how do we set the IAM role permissions then?

Not expecting you to help me with all this, but still valid questions.. :)

@idanlo idanlo commented Sep 20, 2019

@Buder Yes, you create the function from the CLI using amplify add function. To give it the environment variable, go to that Lambda's page in the AWS Lambda console; when you scroll down, you can see a list of environment variables, and there you can add your own. For example, you can add a USER_TABLE_NAME variable and then access it in the code to perform actions on a DynamoDB table.

If you want the function to have access to that table only, you can scroll down on the Lambda page and edit that function's IAM role. There you can edit the policy and add the permissions you need, for example the DynamoDB PutItem action, and give it the table ARN so that it can only access that table and not other tables.
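A minimal policy of the kind described above might look like this, in the same style as the CloudFormation snippets earlier in the thread (the account ID, region, and table name in the ARN are placeholders):

```json
{
	"Version": "2012-10-17",
	"Statement": [
		{
			"Effect": "Allow",
			"Action": [
				"dynamodb:PutItem",
				"dynamodb:GetItem",
				"dynamodb:UpdateItem"
			],
			"Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/<your-table>"
		}
	]
}
```

Scoping "Resource" to a single table ARN, rather than "*", is what restricts the function to that one table.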

@Buder Buder commented Sep 20, 2019

@idanlo

Ok, I see. Yes, I can do this, but it requires me to create some polling sequence to react to the INSERT of a new table item. What I am looking for is a trigger, so that the Lambda executes when an INSERT happens on the table. As far as I understand, your solution does not handle that.

On the other hand, I think it is possible to associate the hello-world Lambda with the table trigger from DynamoDB by setting some data there and enabling a stream, for instance.

@idanlo idanlo commented Sep 20, 2019

@Buder This is actually something I am looking into right now. You can add a trigger through the DynamoDB console and either create a new Lambda function or use an existing one for that trigger. Choosing "create a new function" didn't work for me, so I created one manually, and I needed to add the correct policies to that function's role so that it would work with the DynamoDB trigger (DynamoDB will tell you which permissions to add).
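For reference, the stream-read permissions a function's role needs before DynamoDB will accept it as a trigger target are roughly the following statement, again in the thread's CloudFormation JSON style (the stream ARN is a placeholder):

```json
{
	"Effect": "Allow",
	"Action": [
		"dynamodb:DescribeStream",
		"dynamodb:GetRecords",
		"dynamodb:GetShardIterator",
		"dynamodb:ListStreams"
	],
	"Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/<your-table>/stream/*"
}
```

These are the same actions granted by the managed AWSLambdaDynamoDBExecutionRole policy, scoped here to one table's streams.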

@Buder Buder commented Sep 20, 2019

@idanlo

Ok, let me know your findings here; they're valuable for all. I will try myself this weekend to set up a Lambda and trigger it from a table insert. Do you know if this will also work when invoking the Lambda with the amplify command (running the Lambda locally, I guess)?

I will post if I get it working in some way.

@idanlo idanlo commented Sep 20, 2019

I guess it will work, but it won't pass the parameters that the trigger passes. Instead, you can change/create some dummy data in the DynamoDB table and then change it back; that should invoke the trigger.

@aireater (Author) commented Sep 26, 2019

It would be nice if we could have an @trigger directive that works like the @function directive.

@ambientlight (Contributor) commented Sep 30, 2019

A pull request resolving this has been added: #2463

@aireater:
This doesn't add a @trigger transformer, though; that could be another feature request. But keeping anything unrelated to the GraphQL API out of the transformers might be easier to manage and slightly more flexible.

@mormsbee mormsbee commented Nov 11, 2019

All I needed to do was add the following event source mapping resource; I already had everything else needed in the function's CloudFormation template:

	"someTableTrigger": {
		"Type": "AWS::Lambda::EventSourceMapping",
		"DependsOn": ["AmplifyResourcesPolicy"],
		"Properties": {
			"BatchSize": 1,
			"Enabled": true,
			"EventSourceArn": { "Ref": "someTableStreamArn" },
			"FunctionName": { "Fn::GetAtt": ["LambdaFunction", "Arn"] },
			"StartingPosition": "LATEST"
		}
	}

And I added "someTableStreamArn" as a parameter for the template.
