
S3 Hook

This project creates an AWS Lambda function that sends logs from files stored in an S3 bucket to Logz.io.

Instructions:

To deploy this project, click the button that matches the region you wish to deploy your stack to:

Region Deployment
us-east-1 Deploy to AWS
us-east-2 Deploy to AWS
us-west-1 Deploy to AWS
us-west-2 Deploy to AWS
eu-central-1 Deploy to AWS
eu-north-1 Deploy to AWS
eu-west-1 Deploy to AWS
eu-west-2 Deploy to AWS
eu-west-3 Deploy to AWS
sa-east-1 Deploy to AWS
ap-northeast-1 Deploy to AWS
ap-northeast-2 Deploy to AWS
ap-northeast-3 Deploy to AWS
ap-south-1 Deploy to AWS
ap-southeast-1 Deploy to AWS
ap-southeast-2 Deploy to AWS
ca-central-1 Deploy to AWS

1. Specify stack details

Specify the stack details as per the table below, check the checkboxes, and select Create stack.

Parameter Description Required/Default
logzioListener The Logz.io listener URL for your region. (For more details, see the regions page.) Required
logzioToken Your Logz.io log shipping token. Required
logLevel Log level for the Lambda function. Can be one of: debug, info, warn, error, fatal, panic. Default: info
logType The log type you'll use with this Lambda. This is shown in your logs under the type field in Kibana. Logz.io applies parsing based on the log type. Default: s3_hook
includePathsRegexes Comma-separated list of regexes that match the paths you'd like to pull logs from. This field is mutually exclusive with the excludePathsRegexes field. -
excludePathsRegexes Comma-separated list of regexes that match the paths you don't want to pull logs from. This field is mutually exclusive with the includePathsRegexes field. -
pathToFields Fields from the path to your logs directory that you want to add to the logs. For example, org-id/aws-type/account-id will add each of the fields org-id, aws-type and account-id to the logs that are fetched from the directory that this path refers to. -

2. Add trigger

Give the stack a few minutes to deploy.

Once your Lambda function is ready, you'll need to manually add a trigger. This is due to CloudFormation limitations.

Go to the function's page, and click on Add trigger.


Then, choose S3 as a trigger, and fill in:

  • Bucket: Your bucket name.
  • Event type: Choose the All object create events option.
  • Prefix and Suffix should be left empty.

Confirm the checkbox, and click Add.

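If you prefer to set up the trigger programmatically instead of through the console, the same notification configuration can be written with the AWS SDK. Below is a minimal sketch using the AWS SDK for Go v2; the bucket name and function ARN are placeholders, and it assumes the Lambda's resource policy already allows S3 to invoke it (the console flow handles that permission for you):

```go
package main

import (
	"context"
	"log"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
	"github.com/aws/aws-sdk-go-v2/service/s3/types"
)

func main() {
	cfg, err := config.LoadDefaultConfig(context.TODO())
	if err != nil {
		log.Fatal(err)
	}
	client := s3.NewFromConfig(cfg)

	// Equivalent of the console trigger above: all object create events,
	// with no prefix or suffix filter.
	_, err = client.PutBucketNotificationConfiguration(context.TODO(),
		&s3.PutBucketNotificationConfigurationInput{
			Bucket: aws.String("my-log-bucket"), // placeholder bucket name
			NotificationConfiguration: &types.NotificationConfiguration{
				LambdaFunctionConfigurations: []types.LambdaFunctionConfiguration{{
					// Placeholder ARN; use your deployed function's ARN.
					LambdaFunctionArn: aws.String("arn:aws:lambda:us-east-1:123456789012:function:s3-hook"),
					Events:            []types.Event{types.Event("s3:ObjectCreated:*")},
				}},
			},
		})
	if err != nil {
		log.Fatal(err)
	}
}
```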

3. Send logs

That's it. Your function is configured. Once you upload new files to your bucket, each upload will trigger the function, and the logs will be sent to your Logz.io account.
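To verify the setup end to end, you can upload a small test object and then check your Logz.io account. A quick sketch with the AWS SDK for Go v2 (bucket name, key, and log line are placeholders):

```go
package main

import (
	"context"
	"log"
	"strings"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
)

func main() {
	cfg, err := config.LoadDefaultConfig(context.TODO())
	if err != nil {
		log.Fatal(err)
	}
	client := s3.NewFromConfig(cfg)

	// Uploading a new object fires the ObjectCreated event and invokes the hook.
	_, err = client.PutObject(context.TODO(), &s3.PutObjectInput{
		Bucket: aws.String("my-log-bucket"), // placeholder bucket name
		Key:    aws.String("test/sample.log"),
		Body:   strings.NewReader("2023-01-01T00:00:00Z INFO hello from s3-hook test"),
	})
	if err != nil {
		log.Fatal(err)
	}
}
```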

Filtering files

If you want to pull logs only from specific paths within the bucket, use the includePathsRegexes variable. Conversely, if there are particular paths within the bucket that you don't want to pull logs from, use the excludePathsRegexes variable. These fields are mutually exclusive. Both variables take a comma-separated list of regular expressions: includePathsRegexes matches the paths to pull logs from, and excludePathsRegexes matches the paths to skip. Note: every new object added to your bucket will still trigger your Lambda function; if the object key does not match the regexes, the function quits without sending the logs.
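To make the behavior concrete, here is a minimal sketch of how such include/exclude filtering can work. shouldProcess is a hypothetical helper, not the function's actual implementation, and it assumes Go regexp syntax:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// shouldProcess mimics the documented filtering on an object key:
// with include patterns, at least one must match; with exclude
// patterns, none may match. The two lists are mutually exclusive.
func shouldProcess(key, includeCSV, excludeCSV string) bool {
	if includeCSV != "" {
		for _, p := range strings.Split(includeCSV, ",") {
			if regexp.MustCompile(p).MatchString(key) {
				return true
			}
		}
		return false
	}
	for _, p := range strings.Split(excludeCSV, ",") {
		if p != "" && regexp.MustCompile(p).MatchString(key) {
			return false
		}
	}
	return true
}

func main() {
	fmt.Println(shouldProcess("prod/app/file.log", `^prod/`, "")) // true
	fmt.Println(shouldProcess("dev/app/file.log", `^prod/`, ""))  // false
	fmt.Println(shouldProcess("tmp/file.log", "", `^tmp/`))       // false
}
```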

Adding object path as logs field

If you want to use your objects' paths as extra fields in your logs, you can do so with pathToFields.

For example, suppose your objects are under the path oi-3rfEFA4/AWSLogs/2378194514/file.log, where oi-3rfEFA4 is the org id, AWSLogs is the aws type, and 2378194514 is the account id.

Setting pathToFields to the value org-id/aws-type/account-id will add the following fields to the logs: org-id: oi-3rfEFA4, aws-type: AWSLogs, account-id: 2378194514. A sketch of this mapping follows the notes below.

Important notes about pathToFields:

  1. This will override a field with the same key, if it exists.
  2. In order for the feature to work, you need to specify pathToFields starting from the root of the bucket.
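The mapping pairs each field name in pathToFields with the object key segment in the same position, counted from the bucket root. A minimal sketch of that behavior (fieldsFromPath is a hypothetical helper, not the function's actual source):

```go
package main

import (
	"fmt"
	"strings"
)

// fieldsFromPath pairs each field name from pathToFields with the
// corresponding segment of the object key, starting at the bucket root.
func fieldsFromPath(pathToFields, objectKey string) map[string]string {
	names := strings.Split(pathToFields, "/")
	parts := strings.Split(objectKey, "/")
	fields := map[string]string{}
	for i, name := range names {
		if i >= len(parts)-1 { // the last segment is the file name
			break
		}
		fields[name] = parts[i]
	}
	return fields
}

func main() {
	f := fieldsFromPath("org-id/aws-type/account-id", "oi-3rfEFA4/AWSLogs/2378194514/file.log")
	fmt.Println(f) // map[account-id:2378194514 aws-type:AWSLogs org-id:oi-3rfEFA4]
}
```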

Automatic parsing:

S3 Hook will automatically parse logs in the following cases:

  • The object's path contains the phrase cloudtrail (case insensitive).
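That detection amounts to a case-insensitive substring check on the object key, along these lines (a sketch, not the actual source):

```go
package main

import (
	"fmt"
	"strings"
)

// isCloudTrail reports whether an object key looks like CloudTrail logs,
// per the documented case-insensitive match on "cloudtrail".
func isCloudTrail(key string) bool {
	return strings.Contains(strings.ToLower(key), "cloudtrail")
}

func main() {
	fmt.Println(isCloudTrail("AWSLogs/123456789012/CloudTrail/us-east-1/file.json.gz")) // true
}
```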

Control Tower support:

If you want to ship Control Tower logs, you'll need to deploy an additional stack after deploying the S3 Hook stack. For more details, see the project's Control Tower documentation.

Changelog

  • 0.4.1:
    • Upgrade the Control Tower runtime to provided.al2023 to ensure compatibility with AWS.
  • 0.4.0:
    • Extend log processing to include .txt, .log, and .json files; exclude unsupported file types for improved efficiency.
    • Update AWS Lambda runtime to provided.al2023 for enhanced performance and compatibility.
  • 0.3.0:
    • Add a new option to exclude paths from which logs won't be pulled.
    • Rename field pathsRegexes to includePathsRegexes.
  • 0.2.1:
    • Bug fix - remove redundant usage of Content Type.
  • 0.2.0:
    • CloudFormation template changes:
      • Lambda function name will be the same as the stack name.
      • IAM Role will have a unique name.
      • IAM Role allows GetObject action for all buckets.
    • Remove the bucketName parameter.
  • 0.1.0:
    • Add ability to filter paths with regex list in field pathsRegexes.
    • Add ability to map bucket path as log fields with pathToFields.
    • Add support for Control Tower.
    • Automatically detect and parse CloudTrail logs.
  • 0.0.2:
    • Bug fix: Decodes folder names, for folders with special characters.
  • 0.0.1: Initial release.