This Lambda function validates CSV files uploaded to an S3 bucket and moves erroneous files to an error bucket.
- Validation Checks:
  - Validates `product_line` (Bakery, Meat, Dairy).
  - Validates `currency` (USD, MXN, CAD).
  - Checks the date format (YYYY-MM-DD).
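As a rough illustration of these checks, a row validator might look like the sketch below. The function and argument names are assumptions for illustration, not the actual code.

```python
from datetime import datetime

VALID_PRODUCT_LINES = {"Bakery", "Meat", "Dairy"}
VALID_CURRENCIES = {"USD", "MXN", "CAD"}

def find_row_errors(product_line, currency, bill_date):
    """Return a list of validation errors for one CSV row (empty list means the row is valid)."""
    errors = []
    if product_line not in VALID_PRODUCT_LINES:
        errors.append(f"invalid product_line: {product_line}")
    if currency not in VALID_CURRENCIES:
        errors.append(f"invalid currency: {currency}")
    try:
        datetime.strptime(bill_date, "%Y-%m-%d")  # expects YYYY-MM-DD
    except ValueError:
        errors.append(f"invalid date format: {bill_date}")
    return errors
```

For example, `find_row_errors("Bakery", "USD", "2023-10-01")` returns an empty list, while `find_row_errors("Produce", "EUR", "01/10/2023")` returns three errors.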
- Error Handling:
  - Moves files with errors to a designated error bucket.
  - Deletes the original file after moving.
- Buckets:
  - Primary Bucket: Upload CSV files here.
  - Error Bucket: Stores files with validation errors (configured as `billing-test-error-111` in the code).
- Trigger:
  - Configure an S3 PUT event trigger on the primary bucket to invoke this Lambda (a boto3 sketch follows below).
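If you prefer to wire up the trigger programmatically rather than in the console, a sketch with boto3 could look like this. The bucket name, function ARN, and statement ID are placeholders.

```python
import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

PRIMARY_BUCKET = "your-primary-bucket"  # placeholder
FUNCTION_ARN = "arn:aws:lambda:us-east-1:123456789012:function:csv-validator"  # placeholder

# Allow S3 to invoke the Lambda, then register the PUT notification.
lambda_client.add_permission(
    FunctionName=FUNCTION_ARN,
    StatementId="allow-s3-put-invoke",  # placeholder
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn=f"arn:aws:s3:::{PRIMARY_BUCKET}",
)

s3.put_bucket_notification_configuration(
    Bucket=PRIMARY_BUCKET,
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": FUNCTION_ARN,
                "Events": ["s3:ObjectCreated:Put"],
            }
        ]
    },
)
```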
- Permissions:
  - Ensure the Lambda has IAM permissions to read, write, and delete objects in both S3 buckets.
Workflow:
- A CSV is uploaded to the primary bucket.
- The Lambda checks each row for valid data.
- If errors are found:
  - The file is copied to the error bucket.
  - The original file is deleted.
- Returns a `200` status if no errors are detected.
💡 Note: Update the error bucket variable in the code if renaming the error bucket.
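Putting the pieces together, the handler might take roughly the shape below. Only the error bucket name `billing-test-error-111` comes from the code; the column names, event parsing, and return bodies are assumptions for illustration.

```python
import csv
import boto3

s3 = boto3.client("s3")
ERROR_BUCKET = "billing-test-error-111"  # update this variable if you rename the error bucket

VALID_PRODUCT_LINES = {"Bakery", "Meat", "Dairy"}
VALID_CURRENCIES = {"USD", "MXN", "CAD"}

def lambda_handler(event, context):
    # Invoked by the S3 PUT event on the primary bucket.
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    rows = list(csv.DictReader(body.splitlines()))

    # Check each row; date-format validation is omitted here (see the validator sketch above).
    has_errors = any(
        row.get("product_line") not in VALID_PRODUCT_LINES
        or row.get("currency") not in VALID_CURRENCIES
        for row in rows
    )

    if has_errors:
        # Copy the file to the error bucket, then delete the original.
        s3.copy_object(
            Bucket=ERROR_BUCKET,
            Key=key,
            CopySource={"Bucket": bucket, "Key": key},
        )
        s3.delete_object(Bucket=bucket, Key=key)
        return {"statusCode": 200, "body": f"{key} moved to {ERROR_BUCKET}"}

    return {"statusCode": 200, "body": "no errors detected"}
```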
This Lambda processes CSV files from an S3 bucket, converts billing amounts to USD, and stores the data in DynamoDB.
- Process CSV Data: Reads CSV files uploaded to S3.
- Currency Conversion: Converts `bill_amount` to USD using hardcoded rates for CAD, MXN, and USD.
- DynamoDB Storage: Inserts processed records into a DynamoDB table (`dynamo-billing`).
- S3 Bucket: Configured to trigger this Lambda on `PUT` events (CSV uploads).
- DynamoDB Table (see the creation sketch after this list):
  - Name: `dynamo-billing` (in region `us-east-1`).
  - Required Attributes: `{ "id": "Number", "company_name": "String", "country": "String", "city": "String", "product_line": "String", "item": "String", "bill_date": "String", "currency": "String", "bill_amount": "String", "usd_amount": "String" }`
- IAM Permissions:
  - Lambda must have access to:
    - Read from the S3 bucket.
    - Write to DynamoDB (`dynamo-billing`).
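If the table does not exist yet, it could be created along these lines. The key schema is an assumption (the notes above only list the attributes), so adjust it to match your actual table.

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Assumes "id" (a Number) is the partition key -- adjust if your table uses a different key schema.
dynamodb.create_table(
    TableName="dynamo-billing",
    AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "N"}],
    KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)
```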
The uploaded CSV's columns must include (in order): `id, company_name, country, city, product_line, item, bill_date, currency, bill_amount`.
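For illustration only, a file matching that layout could look like this (the values are made up):

```
id,company_name,country,city,product_line,item,bill_date,currency,bill_amount
1,Example Foods,Canada,Toronto,Bakery,Croissant,2023-10-01,CAD,125.50
```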
- Trigger: A CSV file is uploaded to S3.
- Processing (sketched below):
  - Skips the CSV header row.
  - Converts `bill_amount` to USD using predefined rates.
- Storage: Each row is inserted into DynamoDB.
- Output: Prints success/errors to CloudWatch logs.
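A minimal sketch of that flow is below. The conversion rates shown are placeholders (the real values live in `currency_conversion_usd` in the code), and the event parsing and exact error handling are assumptions apart from the log messages listed further down.

```python
import csv
import boto3

s3 = boto3.client("s3")
table = boto3.resource("dynamodb", region_name="us-east-1").Table("dynamo-billing")

# Placeholder rates -- see currency_conversion_usd in the code for the real values.
currency_conversion_usd = {"USD": 1.0, "CAD": 0.75, "MXN": 0.06}

def lambda_handler(event, context):
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    reader = csv.reader(body.splitlines())
    next(reader)  # skip the CSV header row

    for row in reader:
        (row_id, company_name, country, city, product_line,
         item, bill_date, currency, bill_amount) = row

        rate = currency_conversion_usd.get(currency)
        if rate is None:
            print(f"Unsupported currency: {currency}")  # still inserted, with usd_amount = 0
            usd_amount = 0
        else:
            usd_amount = float(bill_amount) * rate

        try:
            table.put_item(Item={
                "id": int(row_id),
                "company_name": company_name,
                "country": country,
                "city": city,
                "product_line": product_line,
                "item": item,
                "bill_date": bill_date,
                "currency": currency,
                "bill_amount": bill_amount,
                "usd_amount": str(usd_amount),
            })
            print("SUCCESS <=======")  # per successful insert
        except Exception as exc:
            print(f"Error inserting row {row_id}: {exc}")  # logged, but execution continues

    print("LAMBDA HAS FINISHED <<<<-------")  # final completion message
```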
- Currency Support: Only `USD`, `CAD`, and `MXN` are supported. Unsupported currencies log an error, but the record is still inserted with `usd_amount = 0`.
- Hardcoded Rates: Update `currency_conversion_usd` in the code if rates change.
- Error Handling: Errors during DynamoDB insertion are logged but do not halt execution.
- Log Output:
  - `SUCCESS <=======`: printed after each successful insert.
  - `LAMBDA HAS FINISHED <<<<-------`: final completion message.
This Lambda function automates the creation of EBS volume snapshots for EC2 instances and tags them with the creation date.
- Automated Snapshots: Creates a snapshot of a specified EBS volume.
- Tagging: Adds a `Name` tag with the snapshot date (e.g., `My EC2 Snapshot-2023-10-01`).
- Error Handling: Returns success/error responses for debugging.
- Volume ID: Replace the hardcoded `vol-025ee49cd5e5db665` with your target EBS volume ID.
- IAM Permissions:
  - The Lambda execution role must include:
    - `ec2:CreateSnapshot` (to create snapshots).
    - `ec2:CreateTags` (to tag snapshots).
- Trigger: Configure a trigger (e.g., CloudWatch Events for scheduled snapshots; a sketch follows below).
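For a scheduled trigger, one possible setup with boto3 is sketched here. The rule name, function name, and ARNs are placeholders, and a daily schedule is assumed.

```python
import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

FUNCTION_NAME = "ebs-snapshot-lambda"  # placeholder
FUNCTION_ARN = "arn:aws:lambda:us-east-1:123456789012:function:ebs-snapshot-lambda"  # placeholder

# Create a scheduled rule (daily here; adjust the expression as needed).
rule = events.put_rule(
    Name="daily-ebs-snapshot",
    ScheduleExpression="rate(1 day)",
    State="ENABLED",
)

# Allow the rule to invoke the Lambda, then attach the function as a target.
lambda_client.add_permission(
    FunctionName=FUNCTION_NAME,
    StatementId="allow-eventbridge-daily-snapshot",  # placeholder
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule["RuleArn"],
)
events.put_targets(
    Rule="daily-ebs-snapshot",
    Targets=[{"Id": "ebs-snapshot-target", "Arn": FUNCTION_ARN}],
)
```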
- Trigger: Lambda is invoked (manually or via a trigger).
- Snapshot Creation:
  - Creates a snapshot of the specified EBS volume.
  - Tags it with a `Name` containing the current date.
- Response:
  - Returns `200` with snapshot details on success.
  - Returns `400` with error details on failure.
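A sketch of a handler along those lines, using the hardcoded volume ID mentioned above; the snapshot description text and the exact shape of the error handling are assumptions.

```python
import datetime
import boto3
from botocore.exceptions import ClientError

ec2 = boto3.client("ec2")
VOLUME_ID = "vol-025ee49cd5e5db665"  # hardcoded volume ID -- replace with your own

def lambda_handler(event, context):
    try:
        # Create the snapshot, then tag it with a Name containing today's date.
        snapshot = ec2.create_snapshot(
            VolumeId=VOLUME_ID,
            Description="Automated snapshot created by Lambda",  # assumed description
        )
        today = datetime.date.today().isoformat()
        ec2.create_tags(
            Resources=[snapshot["SnapshotId"]],
            Tags=[{"Key": "Name", "Value": f"My EC2 Snapshot-{today}"}],
        )
        return {
            "statusCode": 200,
            "body": str({
                "SnapshotId": snapshot["SnapshotId"],
                "StartTime": str(snapshot["StartTime"]),
            }),
        }
    except ClientError as err:
        return {"statusCode": 400, "body": str({"error": str(err)})}
```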
Example responses:
- Success: `{ "statusCode": 200, "body": "{'SnapshotId': 'snap-123abc...', 'StartTime': '2023-10-01...'}" }`
- Error: `{ "statusCode": 400, "body": "{'error': 'VolumeInUse: Volume is attached to an instance'}" }`

Notes:
- Hardcoded Volume ID: Update `VolumeId` in the code to match your EBS volume.
- Tagging: Snapshots are tagged for easier identification and cost tracking.
- Logging: Uncomment the `logger` lines to enable detailed CloudWatch logging.
Use cases:
- Daily/weekly backup schedules.
- Pre-upgrade/system migration backups.