fix(package): Only package if files have changed (#1789)

sriram-mv committed Feb 18, 2020
1 parent 3f4cfef commit bfd327f

Showing 11 changed files with 333 additions and 36 deletions.
143 changes: 143 additions & 0 deletions designs/md5_checksums_package_deploy.md
@@ -0,0 +1,143 @@
Checksum on artifacts for `sam package`
====================================


What is the problem?
--------------------

Today, `sam package` goes through the list of packageable paths, looks up the corresponding objects in S3, and compares checksums between the local artifacts and what is already in S3 (if the objects exist). For `zip` files this comparison is prone to failure, because the zip archive does not respect the permissions of the underlying directory that was zipped. The checksums calculated locally and in S3 therefore differ, resulting in a re-upload every time the application is deployed, even with no changes in source.

Let's consider the following cases:

NOTE: `sam deploy` attempts to package on deploy.

* `sam build` -> `sam deploy`: results in an upload to S3.
* `sam build` -> `sam deploy` (S3 upload) -> `sam deploy` (S3 upload again on the same built artifacts).

What will be changed?
---------------------

Instead of calculating the checksum of the zip file, the checksum is calculated on the directory that is about to be zipped. A minimal sketch of the idea follows the list below.

* Symlinks within the directory are followed.
* Cyclic symlinks cause packaging to fail.
* Both the names and the contents of the files within the directory are used to calculate the checksum.
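
To make the intent concrete, here is a minimal illustrative sketch of checksumming a directory by folding both file names and file contents into a single digest. This is a sketch only: `sketch_dir_checksum` is a hypothetical name, and the actual implementation added by this change lives in `samcli/lib/utils/hash.py`, shown later in this diff.

```python
import hashlib
import os


def sketch_dir_checksum(directory):
    # Hypothetical sketch: fold every file's path and raw contents into one md5.
    md5 = hashlib.md5()
    for dirpath, _, filenames in os.walk(directory, followlinks=True):
        for filename in sorted(filenames):  # sort for a stable walk order
            filepath = os.path.join(dirpath, filename)
            md5.update(filepath.encode("utf-8"))  # the file name contributes
            with open(filepath, "rb") as f:
                md5.update(f.read())  # the file content contributes
    return md5.hexdigest()
```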

What algorithm is used for checksum calculation?
------------------------------------------------

* `md5`

Caveat: Hash collisions are still possible with `md5`; `sha256` would be more collision-resistant in this respect, but the codebase has been using `md5` for a while and switching to `sha256` could cause regressions (at minimum, presumably, existing md5-derived S3 object keys would no longer match, forcing a one-time re-upload of all artifacts).
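
Since the digest is used only for deduplication, not security, the algorithm choice is isolated to a single `hashlib` constructor. A hedged sketch of how the helpers could be parameterized if a later switch were desired (`checksum` here is a hypothetical helper, not part of this change):

```python
import hashlib


def checksum(data, algorithm=hashlib.md5):
    # Hypothetical helper: the hash constructor is a parameter, so moving
    # off md5 later would be a one-argument change at the call sites.
    digest = algorithm()
    digest.update(data)
    return digest.hexdigest()


print(checksum(b"hello"))                  # md5, the current behavior
print(checksum(b"hello", hashlib.sha256))  # possible future behavior
```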

Success criteria for the change
-------------------------------

* `sam build` -> `sam deploy` -> `sam deploy` (the second deploy does not result in another upload)
* `sam build` -> `sam deploy` -> `sam build` (no changes to source) -> `sam deploy` (the final deploy does not result in another upload)

Out-of-Scope
------------

* This is a bug fix of the prior implementation.

User Experience Walkthrough
---------------------------

No change to the user-facing workflow: repeated deploys of unchanged artifacts simply stop re-uploading to S3.

Implementation
==============

CLI Changes
-----------

- No changes to the CLI parameters themselves.

### Breaking Change

- Users who relied on `sam deploy` always re-deploying, even with no changes made to the source, may be broken by this change.

Design
------

- A new method called `dir_checksum` takes a directory as input and returns an md5 checksum of all the contents within that directory; a usage sketch follows this list.
  * Walks through all subdirectories and files.
  * Checksums both the name and the contents of each file.
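
A short usage sketch (the path is a placeholder; `dir_checksum` is the new helper added in `samcli/lib/utils/hash.py`):

```python
from samcli.lib.utils.hash import dir_checksum

# Hashing the same unchanged directory twice yields the same digest,
# which is what lets package skip the re-upload.
first = dir_checksum("/path/to/.aws-sam/build/HelloWorldFunction")
second = dir_checksum("/path/to/.aws-sam/build/HelloWorldFunction")
assert first == second
```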

`samconfig.toml` Changes
----------------

None

Security
--------

**What new dependencies (libraries/cli) does this change require?**

N/A

**What other Docker container images are you using?**

N/A

**Are you creating a new HTTP endpoint? If so explain how it will be
created & used**

N/A

**Are you connecting to a remote API? If so explain how is this
connection secured**

N/A

**Are you reading/writing to a temporary folder? If so, what is this
used for and when do you clean up?**

No temporary folders are read, but the contents of each file in the specified directory are read in order to determine the md5 checksum.

**How do you validate new .samrc configuration?**

N/A

What is your Testing Plan (QA)?
===============================

Goal
----

* Integration and Unit tests pass

Prerequisites
--------------

N/A

Test Scenarios/Cases
--------------------
* Build and deploy an application, then rebuild (with no source changes) and attempt to deploy again. The second deploy should not trigger a new upload.

Expected Results
----------------
* Scenario tests are successful


Documentation Changes
=====================

* This fixes an underlying bug; the documentation does not state that this is an issue today, so no documentation changes are needed.

Open Issues
============

* https://github.com/awslabs/aws-sam-cli/issues/1779

Task Breakdown
==============

- [x] Send a Pull Request with this design document
- [ ] Build the command line interface
- [ ] Build the underlying library
- [ ] Unit tests
- [ ] Functional Tests
- [ ] Integration tests
- [ ] Run all tests on Windows
- [ ] Update documentation
10 changes: 6 additions & 4 deletions samcli/lib/package/artifact_exporter.py
@@ -47,6 +47,7 @@
 
 from samcli.commands._utils.template import METADATA_WITH_LOCAL_PATHS, RESOURCES_WITH_LOCAL_PATHS
 from samcli.commands.package import exceptions
+from samcli.lib.utils.hash import dir_checksum
 from samcli.yamlhelper import yaml_dump, yaml_parse
 
 
@@ -165,8 +166,8 @@ def resource_not_packageable(resource_dict):
 
 
 def zip_and_upload(local_path, uploader):
-    with zip_folder(local_path) as zip_file:
-        return uploader.upload_with_dedup(zip_file)
+    with zip_folder(local_path) as (zip_file, md5_hash):
+        return uploader.upload_with_dedup(zip_file, precomputed_md5=md5_hash)
 
 
 @contextmanager
@@ -178,11 +179,12 @@ def zip_folder(folder_path):
     :param folder_path:
     :return: Name of the zipfile
     """
-    filename = os.path.join(tempfile.gettempdir(), "data-" + uuid.uuid4().hex)
+    md5hash = dir_checksum(folder_path, followlinks=True)
+    filename = os.path.join(tempfile.gettempdir(), "data-" + md5hash)
 
     zipfile_name = make_zip(filename, folder_path)
     try:
-        yield zipfile_name
+        yield zipfile_name, md5hash
     finally:
         if os.path.exists(zipfile_name):
             os.remove(zipfile_name)
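
With this change, `zip_folder` yields the directory hash alongside the zip name, and the staging file name itself becomes deterministic (`data-<md5>` instead of `data-<uuid4>`). A hedged usage sketch of the new context manager (the path is a placeholder):

```python
from samcli.lib.package.artifact_exporter import zip_folder

with zip_folder("/path/to/build/artifacts") as (zip_file, md5_hash):
    # zip_file is a temp archive whose name embeds md5_hash, so packaging
    # the same unchanged directory repeatedly stages the same name.
    print(zip_file, md5_hash)
```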
27 changes: 4 additions & 23 deletions samcli/lib/package/s3_uploader.py
@@ -28,6 +28,7 @@
 from boto3.s3 import transfer
 
 from samcli.commands.package.exceptions import NoSuchBucketError, BucketNotSpecifiedError
+from samcli.lib.utils.hash import file_checksum
 
 LOG = logging.getLogger(__name__)

@@ -106,20 +107,21 @@ def upload(self, file_name, remote_path):
                 raise NoSuchBucketError(bucket_name=self.bucket_name)
             raise ex
 
-    def upload_with_dedup(self, file_name, extension=None):
+    def upload_with_dedup(self, file_name, extension=None, precomputed_md5=None):
         """
         Makes and returns name of the S3 object based on the file's MD5 sum
         :param file_name: file to upload
         :param extension: String of file extension to append to the object
+        :param precomputed_md5: Specified md5 hash for the file to be uploaded.
         :return: S3 URL of the uploaded object
         """
 
         # This construction of remote_path is critical to preventing duplicate
         # uploads of the same object. The uploader will check if the file exists
         # in S3 and re-upload only if necessary. So if the template points to the
         # same file in multiple places, this will upload only once.
-        filemd5 = self.file_checksum(file_name)
+        filemd5 = precomputed_md5 or file_checksum(file_name)
         remote_path = filemd5
         if extension:
             remote_path = remote_path + "." + extension
@@ -150,27 +152,6 @@ def make_url(self, obj_path):
             raise BucketNotSpecifiedError()
         return "s3://{0}/{1}".format(self.bucket_name, obj_path)
 
-    def file_checksum(self, file_name):
-
-        with open(file_name, "rb") as file_handle:
-            md5 = hashlib.md5()
-            # Read file in chunks of 4096 bytes
-            block_size = 4096
-
-            # Save current cursor position and reset cursor to start of file
-            curpos = file_handle.tell()
-            file_handle.seek(0)
-
-            buf = file_handle.read(block_size)
-            while buf:
-                md5.update(buf)
-                buf = file_handle.read(block_size)
-
-            # Restore file cursor's position
-            file_handle.seek(curpos)
-
-            return md5.hexdigest()
-
     def to_path_style_s3_url(self, key, version=None):
         """
         This link describes the format of Path Style URLs
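
The effect of `precomputed_md5` is that the S3 object key is derived from the directory's hash rather than from the zip archive's bytes (which vary across invocations), so an unchanged directory always maps to the same key and the existence check can skip the upload. A hedged sketch (paths are placeholders, and construction of the `S3Uploader` instance is elided):

```python
from samcli.lib.utils.hash import dir_checksum

md5_hash = dir_checksum("/path/to/build/artifacts")
# With an S3Uploader instance in hand, the object key becomes the directory hash:
# uploader.upload_with_dedup(zip_path, precomputed_md5=md5_hash)
# -> "s3://<bucket>/" + md5_hash
```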
63 changes: 63 additions & 0 deletions samcli/lib/utils/hash.py
@@ -0,0 +1,63 @@
"""
Hash calculation utilities for files and directories.
"""
import os
import hashlib

BLOCK_SIZE = 4096


def file_checksum(file_name):
    """
    Parameters
    ----------
    file_name: file name of the file for which the md5 checksum is required.

    Returns
    -------
    md5 checksum of the given file.
    """
    with open(file_name, "rb") as file_handle:
        md5 = hashlib.md5()

        # Save current cursor position and reset cursor to start of file
        curpos = file_handle.tell()
        file_handle.seek(0)

        # Read the file in chunks of BLOCK_SIZE bytes
        buf = file_handle.read(BLOCK_SIZE)
        while buf:
            md5.update(buf)
            buf = file_handle.read(BLOCK_SIZE)

        # Restore file cursor's position
        file_handle.seek(curpos)

        return md5.hexdigest()


def dir_checksum(directory, followlinks=True):
    """
    Parameters
    ----------
    directory : A directory with an absolute path
    followlinks : Follow symbolic links through the given directory

    Returns
    -------
    md5 checksum of the directory.
    """
    md5_dir = hashlib.md5()
    # Walk through the given directory and find all directories and files.
    for dirpath, _, filenames in os.walk(directory, followlinks=followlinks):
        # Go through every file in the directory and its sub-directories.
        for filepath in [os.path.join(dirpath, filename) for filename in filenames]:
            # Hash both the file name and the file's contents: the path and
            # the file's md5 hexdigest are encoded as utf-8 bytes.
            md5_dir.update(filepath.encode("utf-8"))
            filepath_checksum = file_checksum(filepath)
            md5_dir.update(filepath_checksum.encode("utf-8"))
    return md5_dir.hexdigest()
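
A small self-contained sketch of the new helpers' dedup property (temporary paths only):

```python
import tempfile
from pathlib import Path

from samcli.lib.utils.hash import dir_checksum, file_checksum

with tempfile.TemporaryDirectory() as tmp:
    Path(tmp, "app.py").write_text("print('hello')")
    first = dir_checksum(tmp)
    assert dir_checksum(tmp) == first       # unchanged directory, same digest
    Path(tmp, "app.py").write_text("print('changed')")
    assert dir_checksum(tmp) != first       # content change, new digest
    print(file_checksum(str(Path(tmp, "app.py"))))
```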
16 changes: 10 additions & 6 deletions tests/integration/deploy/deploy_integ_base.py
@@ -1,12 +1,6 @@
 import os
 import uuid
-import json
-import time
-from pathlib import Path
 from unittest import TestCase
-
-import boto3
-
 
 class DeployIntegBase(TestCase):
     @classmethod
@@ -91,3 +85,13 @@ def get_deploy_command_list(
             command_list = command_list + ["--profile", str(profile)]
 
         return command_list
+
+    def get_minimal_build_command_list(self, template_file=None, build_dir=None):
+        command_list = [self.base_command(), "build"]
+
+        if template_file:
+            command_list = command_list + ["--template-file", str(template_file)]
+        if build_dir:
+            command_list = command_list + ["--build-dir", str(build_dir)]
+
+        return command_list
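
A hedged sketch of how the new helper could back the test plan's scenario: build once, deploy, then deploy again and expect no new upload. The stack name, template name, and the exact "no changes" assertion are assumptions for illustration, not verified strings from the test suite:

```python
import subprocess

from tests.integration.deploy.deploy_integ_base import DeployIntegBase


class TestDeployDedup(DeployIntegBase):
    def test_second_deploy_is_a_noop(self):
        # Build once, then deploy the same built artifacts twice.
        build_cmd = self.get_minimal_build_command_list(template_file="template.yaml")
        subprocess.run(build_cmd, check=True)

        deploy_cmd = self.get_deploy_command_list(stack_name="dedup-test")
        subprocess.run(deploy_cmd, check=True)
        second = subprocess.run(deploy_cmd, capture_output=True, check=False)
        # Expect the second run to report that no changes need to be deployed.
```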