This repository has been archived by the owner on Oct 17, 2019. It is now read-only.

Merge pull request #17 from mikegrima/logging2
More logging enhancements
mikegrima committed Aug 29, 2018
2 parents 2716f19 + 035ff3e commit fb7ce26
Showing 2 changed files with 3 additions and 1 deletion.
2 changes: 1 addition & 1 deletion historical_reports/__about__.py
@@ -9,7 +9,7 @@
 __summary__ = "Collection of reporting functions built on top of Historical data sets."
 __uri__ = "https://github.com/Netflix-Skunkworks/historical-reports"
 
-__version__ = "0.1.10"
+__version__ = "0.1.11"
 
 __author__ = "The Historical developers"
 __email__ = "security@netflix.com"
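The __about__.py module keeps the package metadata, including the version bumped here, in one place. A hedged sketch of how a setup script commonly consumes such a file (assuming this repo follows the usual __about__ pattern; the actual setup.py is not part of this diff):

    # Hypothetical consumer: execute __about__.py so the version is defined
    # in exactly one place and picked up at build time.
    about = {}
    with open("historical_reports/__about__.py") as f:
        exec(f.read(), about)

    print(about["__version__"])  # "0.1.11" after this commit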
2 changes: 2 additions & 0 deletions historical_reports/s3/update.py
@@ -55,8 +55,10 @@ def process_dynamodb_record(record, s3_report):
     # If the current object is too big for SNS, and it's not in the current table, then delete it.
     # -- OR -- is this a soft-deletion? (Config set to {})
     if not modified_bucket or not modified_bucket.configuration.attribute_values:
+        log.debug('Processing deletion for: {}'.format(record['dynamodb']['NewImage']["BucketName"]["S"]))
         s3_report["buckets"].pop(record['dynamodb']['NewImage']["BucketName"]["S"], None)
     else:
+        log.debug('Processing: {}'.format(modified_bucket.BucketName))
         s3_report["all_buckets"].append(modified_bucket)


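The two new log.debug calls trace which branch each stream record takes: a deletion (bucket popped from the report) or an update (bucket appended). A minimal sketch of the record shape these lines read, with hypothetical values, plus the logging configuration needed for the new output to be visible at all (debug messages are suppressed at Python's default WARNING level):

    import logging

    logging.basicConfig(level=logging.DEBUG)  # without this, log.debug() prints nothing
    log = logging.getLogger("historical_reports.s3.update")

    # Hypothetical DynamoDB stream record; real events carry many more attributes.
    record = {"dynamodb": {"NewImage": {"BucketName": {"S": "example-bucket"}}}}

    log.debug('Processing deletion for: {}'.format(
        record['dynamodb']['NewImage']["BucketName"]["S"]))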
