
logging-lifecycle

keywords: aws, terraform, kinesis, lambda, log groups, s3, glacier, sqs

this repo provides an example of how to create a log flow: log events are pushed from a CloudWatch log group through Kinesis and on to S3, Glacier, and SQS. all resources are created with Terraform.

prerequisites

  • authentication: either install the aws-cli and sign in, and you can then run the Terraform scripts directly; otherwise add your access keys in main.tf as explained here: terraform docs (a minimal provider sketch follows this list)
  • some event source that pushes logs to a CloudWatch log group. an easy example would be CloudTrail -> CloudWatch, see here: http://docs.aws.amazon.com/awscloudtrail/latest/userguide/send-cloudtrail-events-to-cloudwatch-logs.html
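If you go the main.tf route, the provider block might look like the sketch below. This is a minimal example assuming static credentials; the key values are placeholders, not real values:

provider "aws" {
  region     = "${var.aws_region}"
  access_key = "YOUR-ACCESS-KEY" # placeholder, replace with your own key
  secret_key = "YOUR-SECRET-KEY" # placeholder, replace with your own key
}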

getting started

make sure you provide authentication in one of the ways described above (see prerequisites).

update variables.tf with your settings (a minimal sketch follows the list):

  • aws_region: region you want to use
  • log_group_name: log group that is used as the source for the events that will be pushed to kinesis
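A sketch of what variables.tf might look like; the default values here are assumptions and should be replaced with your own:

variable "aws_region" {
  default = "eu-west-1" # assumption: the region you want to use
}

variable "log_group_name" {
  default = "CloudTrail/DefaultLogGroup" # assumption: your source log group
}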

then run:

terraform plan

to see which resources would be created

then run:

terraform apply

to create the resources (this can take a couple of minutes)

if you want to destroy the resources run:

terraform destroy

overview

(diagram: the log flow from the CloudWatch log group through Kinesis and Lambda to S3, Glacier, and SQS)

acknowledgement: diagram made with draw.io
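The piece that connects the log group to Kinesis in this flow is a CloudWatch Logs subscription filter. A minimal sketch, assuming a Kinesis stream resource and an IAM role that allows CloudWatch Logs to put records on the stream; the resource and role names here are hypothetical:

resource "aws_cloudwatch_log_subscription_filter" "to_kinesis" {
  name            = "logs-to-kinesis"                      # hypothetical name
  log_group_name  = "${var.log_group_name}"
  filter_pattern  = ""                                     # empty pattern forwards all events
  destination_arn = "${aws_kinesis_stream.log_stream.arn}" # hypothetical stream resource
  role_arn        = "${aws_iam_role.cwl_to_kinesis.arn}"   # hypothetical role resource
}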

disclaimer

running this Terraform script is most likely going to cost you some money, just as it would cost you money to create those resources via the aws-cli, the console, or other means.

The Lambda function that moves events to S3

The Lambda function is written in .NET Core. I used Amazon's Yeoman template to create it (see: yeoman template github). If you want to change the function, you can build a new deployable package by navigating to MoveLogsToS3/src/MoveLogsToS3 and running

dotnet lambda package

then navigate to bin/Release/netcoreapp1.0 and copy MoveLogsToS3.zip to /lambdas
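For reference, the Terraform resource that deploys the zipped package might look like the sketch below. The handler string assumes the default naming from the Yeoman template, and the role reference is hypothetical:

resource "aws_lambda_function" "move_logs_to_s3" {
  function_name = "MoveLogsToS3"
  filename      = "lambdas/MoveLogsToS3.zip"
  handler       = "MoveLogsToS3::MoveLogsToS3.Function::FunctionHandler" # assumption: default Yeoman handler naming
  runtime       = "dotnetcore1.0"
  role          = "${aws_iam_role.lambda_exec.arn}" # hypothetical execution role
}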