
Use default ACL for uploaded lambda code (#682)

* Use default ACL for uploaded lambda code

The "Authenticated-Read" ACL, currently set on all uploads, allows your code
to be read by any authenticated AWS user. The default behavior should be to
use the permissions implied by the bucket policy, i.e. "private".

Organizations that do not grant the s3:PutObjectAcl permission (for fear of
data exposure) will block this call.

* Add config option to make default uploads private

Per the PR #682 discussion: although the default can be changed to 'private',
we should still allow users to set 'authenticated-read' if they desire.

Adds a new configuration option, payload_acl, to control this.
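The lookup behavior this option introduces can be sketched as follows. This is a minimal, hypothetical sketch: `resolve_payload_acl` and the validation set are illustrative only; the actual change does no validation and simply defaults the kwarg to 'private'.

```python
# Canned object ACLs recognized by S3, per the AWS docs linked in the diff
# below. The validation here is illustrative; the real hook does not validate.
VALID_CANNED_ACLS = {
    "private", "public-read", "public-read-write", "authenticated-read",
    "aws-exec-read", "bucket-owner-read", "bucket-owner-full-control",
}

def resolve_payload_acl(kwargs):
    # Mirrors the hook's lookup: fall back to 'private' when unset.
    acl = kwargs.get("payload_acl", "private")
    if acl not in VALID_CANNED_ACLS:
        raise ValueError("payload_acl must be a canned S3 ACL, got %r" % acl)
    return acl
```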
zollman authored and phobologic committed Dec 3, 2018
1 parent fe0086c commit 37cd35143be57726ee355d37fe6c303213cb8366
Showing with 18 additions and 5 deletions.
  1. +18 −5 stacker/hooks/aws_lambda.py
@@ -194,7 +194,8 @@ def _head_object(s3_conn, bucket, key):
raise


-def _upload_code(s3_conn, bucket, prefix, name, contents, content_hash):
+def _upload_code(s3_conn, bucket, prefix, name, contents, content_hash,
+                 payload_acl):
"""Upload a ZIP file to S3 for use by Lambda.
The key used for the upload will be unique based on the checksum of the
@@ -210,6 +211,8 @@ def _upload_code(s3_conn, bucket, prefix, name, contents, content_hash):
construct a key name for the uploaded file.
contents (str): byte string with the content of the file upload.
content_hash (str): md5 hash of the contents to be uploaded.
+        payload_acl (str): The canned S3 object ACL to be applied to the
+            uploaded payload
Returns:
troposphere.awslambda.Code: CloudFormation Lambda Code object,
@@ -229,7 +232,7 @@ def _upload_code(s3_conn, bucket, prefix, name, contents, content_hash):
logger.info('lambda: uploading object %s', key)
s3_conn.put_object(Bucket=bucket, Key=key, Body=contents,
ContentType='application/zip',
-                       ACL='authenticated-read')
+                       ACL=payload_acl)

return Code(S3Bucket=bucket, S3Key=key)

@@ -269,7 +272,8 @@ def _check_pattern_list(patterns, key, default=None):
'list of strings'.format(key))


-def _upload_function(s3_conn, bucket, prefix, name, options, follow_symlinks):
+def _upload_function(s3_conn, bucket, prefix, name, options, follow_symlinks,
+                     payload_acl):
"""Builds a Lambda payload from user configuration and uploads it to S3.
Args:
@@ -292,6 +296,8 @@ def _upload_function(s3_conn, bucket, prefix, name, options, follow_symlinks):
file patterns to exclude from the payload (optional).
follow_symlinks (bool): If true, symlinks will be included in the
resulting zip file
+        payload_acl (str): The canned S3 object ACL to be applied to the
+            uploaded payload
Returns:
troposphere.awslambda.Code: CloudFormation AWS Lambda Code object,
@@ -326,7 +332,7 @@ def _upload_function(s3_conn, bucket, prefix, name, options, follow_symlinks):
follow_symlinks)

return _upload_code(s3_conn, bucket, prefix, name, zip_contents,
-                        content_hash)
+                        content_hash, payload_acl)


def select_bucket_region(custom_bucket, hook_region, stacker_bucket_region,
@@ -385,6 +391,8 @@ def upload_lambda_functions(context, provider, **kwargs):
zip name.
follow_symlinks (bool, optional): Will determine if symlinks should
be followed and included with the zip artifact. Default: False
+        payload_acl (str, optional): The canned S3 object ACL to be applied to
+            the uploaded payload. Default: private
functions (dict):
Configurations of desired payloads to build. Keys correspond to
function names, used to derive key names for the payload. Each
@@ -438,6 +446,7 @@ def upload_lambda_functions(context, provider, **kwargs):
bucket: custom-bucket
follow_symlinks: true
prefix: cloudformation-custom-resources/
+payload_acl: authenticated-read
functions:
MyFunction:
path: ./lambda_functions
@@ -494,6 +503,10 @@ def create_template(self):
if not isinstance(follow_symlinks, bool):
raise ValueError('follow_symlinks option must be a boolean')

+    # Check for S3 object acl. Valid values from:
+    # https://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl
+    payload_acl = kwargs.get('payload_acl', 'private')
+
# Always use the global client for s3
session = get_session(bucket_region)
s3_client = session.client('s3')
@@ -505,6 +518,6 @@ def create_template(self):
results = {}
for name, options in kwargs['functions'].items():
results[name] = _upload_function(s3_client, bucket_name, prefix, name,
-                                     options, follow_symlinks)
+                                     options, follow_symlinks, payload_acl)

return results
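The way payload_acl is threaded down to put_object can be exercised without touching AWS by substituting a stub client. Everything here is a hypothetical stand-in for the patched functions above: FakeS3, upload_code, and the bucket and key names are not part of the diff.

```python
class FakeS3:
    """Stand-in for a boto3 S3 client; records keyword args passed to put_object."""
    def __init__(self):
        self.calls = []

    def put_object(self, **kwargs):
        self.calls.append(kwargs)


def upload_code(s3_conn, bucket, key, contents, payload_acl):
    # Same call shape as the patched _upload_code above.
    s3_conn.put_object(Bucket=bucket, Key=key, Body=contents,
                       ContentType="application/zip",
                       ACL=payload_acl)


s3 = FakeS3()
upload_code(s3, "my-bucket", "lambda-abc123.zip", b"zipbytes", "private")
```

Swapping "private" for "authenticated-read" here reproduces the pre-change behavior, which is exactly what the new config option allows.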
