Add ECR report allowlist uploading to image-data-storage bucket #4006
Conversation
@@ -272,6 +276,29 @@ def helper_function_for_leftover_vulnerabilities_from_enhanced_scanning(
    LOGGER.info(
        f"[NonPatchableVulns] [image_uri:{ecr_enhanced_repo_uri}] {json.dumps(non_patchable_vulnerabilities.vulnerability_list, cls=test_utils.EnhancedJSONEncoder)}"
    )

    if is_mainline_context() and not is_generic_image():
On second thought, this helper function is also called during the build phase of the image. We want to make sure that we do not upload the data to the S3 bucket during the build phase.
The env variable TEST_TYPE is already set and present in the environment when it is a test CB job, so you can check for the presence of this environment variable to detect whether it is the test phase or the build phase. You can create a method for checking whether it is the test phase here: https://github.com/aws/deep-learning-containers/blob/master/test/test_utils/__init__.py#L681
Only upload to S3 if it is the test phase; otherwise, skip the upload.
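The check described in the comment above could look something like this (the function name and the exact TEST_TYPE semantics are assumptions based on the review discussion, not code from the repository):

```python
import os


def is_test_phase():
    # Assumption from the review comment: the TEST_TYPE environment
    # variable is set only in test CodeBuild jobs, so its presence
    # distinguishes the test phase from the build phase.
    return "TEST_TYPE" in os.environ
```

Guarding the upload with `if is_test_phase():` would then skip the S3 write when the helper runs during the build phase.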
        image_sha = get_sha_of_an_image_from_ecr(
            ecr_client_for_enhanced_scanning_repo, ecr_enhanced_repo_uri
        )
        s3_resource = boto3.resource("s3")
        sts_client = boto3.client("sts")
        account_id = sts_client.get_caller_identity().get("Account")
        s3object = s3_resource.Object(
            f"image-data-storage-{account_id}", image_sha + "/ecr_allowlist.json"
        )
        s3object.put(
            Body=(
                bytes(
                    json.dumps(
                        allowlist_for_daily_scans.vulnerability_list,
                        cls=test_utils.EnhancedJSONEncoder,
                    ).encode("UTF-8")
                )
            )
        )
        LOGGER.info("ECR allowlist uploaded to S3 Bucket")
Let's create a small function within this file to handle this.
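A sketch of such a helper, factored from the diff above (the function and parameter names are illustrative, not from the PR; the S3 resource is passed in so the snippet can be exercised without AWS credentials):

```python
import json


def upload_allowlist_to_s3(s3_resource, account_id, image_sha, vulnerability_list):
    # Illustrative helper wrapping the upload logic from the diff above:
    # write the allowlist JSON under the image SHA prefix in the
    # image-data-storage bucket for the given account.
    s3_object = s3_resource.Object(
        f"image-data-storage-{account_id}", f"{image_sha}/ecr_allowlist.json"
    )
    s3_object.put(Body=json.dumps(vulnerability_list).encode("UTF-8"))
    return s3_object
```

In the real code, the `cls=test_utils.EnhancedJSONEncoder` argument would still be needed when serializing the vulnerability list.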
…4006)
* testing build
* upload allowlist for daily scans to s3 bucket
* testing allowlist to s3
* fix err
* change s3 upload method, fix err
* change to ecr_enhanced_repo_uri
* change s3 file to json type
* change to building tensorflow
* PT1.13ec2 build
* PT1.13ec2 build
* PT1.13ec2 test
* change to using image-data-storage- bucket
* restored the config file
* add is_mainline_context check
* test resolve comments
* test resolve comments
* resolve comments
* add docstring and indent
* testing build & test
* testing test
* restore config file
* add mainline context check back
---------
Co-authored-by: Anna Liu <ziqili@amazon.com>
GitHub Issue #, if available:
Note:
If merging this PR should also close the associated Issue, please also add that Issue # to the Linked Issues section on the right.
All PRs are checked weekly for staleness. This PR will be closed if not updated within 30 days.
Description
Adds uploading of the ECR report allowlist to an S3 bucket in the ECR scanning portion of the sanity tests. The uploaded allowlist information is used by scanning dashboards to identify allowlisted vulnerabilities for ECR. Checks are added so that this runs only during the test phase and only in mainline build pipelines.
Tests run
NOTE: By default, docker builds are disabled. In order to build your container, please update dlc_developer_config.toml and specify the framework to build in "build_frameworks"
NOTE: If you are creating a PR for a new framework version, please ensure success of the standard, rc, and efa sagemaker remote tests by updating the dlc_developer_config.toml file:
sagemaker_remote_tests = true
sagemaker_efa_tests = true
sagemaker_rc_tests = true
Additionally, please run the sagemaker local tests in at least one revision:
sagemaker_local_tests = true
Formatting
black -l 100 on my code (formatting tool: https://black.readthedocs.io/en/stable/getting_started.html)
DLC image/dockerfile
Builds to Execute
Fill out the template and click the checkbox of the builds you'd like to execute
Note: Replace <X.Y> with the major.minor framework version (i.e. 2.2) you would like to start.
build_pytorch_training_<X.Y>_sm
build_pytorch_training_<X.Y>_ec2
build_pytorch_inference_<X.Y>_sm
build_pytorch_inference_<X.Y>_ec2
build_pytorch_inference_<X.Y>_graviton
build_tensorflow_training_<X.Y>_sm
build_tensorflow_training_<X.Y>_ec2
build_tensorflow_inference_<X.Y>_sm
build_tensorflow_inference_<X.Y>_ec2
build_tensorflow_inference_<X.Y>_graviton
Additional context
PR Checklist
NEURON/GRAVITON Testing Checklist
dlc_developer_config.toml
in my PR branch by setting neuron_mode = true or graviton_mode = true
Benchmark Testing Checklist
dlc_developer_config.toml
in my PR branch by setting ec2_benchmark_tests = true or sagemaker_benchmark_tests = true
Pytest Marker Checklist
@pytest.mark.model("<model-type>") to the new tests which I have added, to specify the Deep Learning model that is used in the test (use "N/A" if the test doesn't use a model)
@pytest.mark.integration("<feature-being-tested>") to the new tests which I have added, to specify the feature that will be tested
@pytest.mark.multinode(<integer-num-nodes>) to the new tests which I have added, to specify the number of nodes used on a multi-node test
@pytest.mark.processor(<"cpu"/"gpu"/"eia"/"neuron">) to the new tests which I have added, if a test is specifically applicable to only one processor type

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license. I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.