Merged
2 changes: 1 addition & 1 deletion airflow/providers/amazon/aws/hooks/s3.py
@@ -220,7 +220,7 @@ def check_for_bucket(self, bucket_name: str | None = None) -> bool:
             # https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Client.head_bucket
             return_code = int(e.response["Error"]["Code"])
             if return_code == 404:
-                self.log.error('Bucket "%s" does not exist', bucket_name)
+                self.log.info('Bucket "%s" does not exist', bucket_name)
Contributor:
Why do we change the log level only for 404?

Contributor Author:
Well, if it's a 403, the bucket exists but the user does not have permissions, so I think it makes sense to log an error for that use case.

Contributor Author:
On the other hand, 404 is a perfectly valid use case: if the bucket does not exist (and the user has permission to check), we should not log an error for that.

Contributor Author:
@eladkal WDYT?

             elif return_code == 403:
                 self.log.error(
                     'Access to bucket "%s" is forbidden or there was an error with the request', bucket_name
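
For reference, a minimal standalone sketch of the resulting check_for_bucket behavior. Assumptions: a plain boto3 client stands in for the hook's connection, and a module-level logger replaces self.log; the real method lives on S3Hook, so this is a simplified reconstruction, not the actual file.

import logging

import boto3
from botocore.exceptions import ClientError

log = logging.getLogger(__name__)


def check_for_bucket(bucket_name: str) -> bool:
    """Return True if the bucket exists and is reachable, False otherwise."""
    try:
        boto3.client("s3").head_bucket(Bucket=bucket_name)
        return True
    except ClientError as e:
        # head_bucket reports failures only through the error code:
        # https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Client.head_bucket
        return_code = int(e.response["Error"]["Code"])
        if return_code == 404:
            # A missing bucket is an expected outcome of an existence check.
            log.info('Bucket "%s" does not exist', bucket_name)
        elif return_code == 403:
            # The bucket exists but the caller lacks permission: a real problem.
            log.error(
                'Access to bucket "%s" is forbidden or there was an error with the request',
                bucket_name,
            )
        return False
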
5 changes: 4 additions & 1 deletion tests/system/providers/amazon/aws/example_emr_serverless.py
@@ -18,6 +18,8 @@

 from datetime import datetime

+import boto3
+
 from airflow.models.baseoperator import chain
 from airflow.models.dag import DAG
 from airflow.providers.amazon.aws.operators.emr import (
@@ -49,7 +51,8 @@
     env_id = test_context[ENV_ID_KEY]
     role_arn = test_context[ROLE_ARN_KEY]
     bucket_name = f"{env_id}-emr-serverless-bucket"
-    entryPoint = "s3://us-east-1.elasticmapreduce/emr-containers/samples/wordcount/scripts/wordcount.py"
+    region = boto3.session.Session().region_name
+    entryPoint = f"s3://{region}.elasticmapreduce/emr-containers/samples/wordcount/scripts/wordcount.py"
     create_s3_bucket = S3CreateBucketOperator(task_id="create_s3_bucket", bucket_name=bucket_name)

     SPARK_JOB_DRIVER = {
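
The point of this change: the EMR sample bucket ({region}.elasticmapreduce) is region-local, so hardcoding us-east-1 breaks the system test when it runs in any other region. A quick sketch of how the region resolves, relying on standard boto3 behavior (note region_name is None when no region is configured anywhere):

import boto3

# boto3 resolves the region through its usual chain: an explicit region_name
# argument, the AWS_REGION / AWS_DEFAULT_REGION environment variables,
# then the active profile in ~/.aws/config.
region = boto3.session.Session().region_name
print(region)  # e.g. "eu-west-1"; None if nothing is configured

# The sample wordcount script lives in a per-region bucket, so the entry
# point must follow the resolved region.
entryPoint = f"s3://{region}.elasticmapreduce/emr-containers/samples/wordcount/scripts/wordcount.py"
print(entryPoint)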