Add task context logging support for remote AWS S3 logging
With the addition of the task context logging feature in PR apache#32646,
this PR extends the feature to AWS S3 when it is set as the remote
logging store. Backward compatibility is ensured for older
versions of Airflow that do not include the feature in
Airflow Core.
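
The compatibility check is runtime feature detection rather than a version pin. A minimal sketch of the pattern applied in the diff below, with explanatory comments added here (the attribute and argument names are taken from the diff; everything else is illustrative):

    def set_context(self, ti, identifier=None):
        # Newer Airflow core (with PR apache#32646) exposes a
        # supports_task_context_logging flag on the base handler and its
        # set_context() accepts an `identifier` keyword. Older cores have
        # neither, so getattr() falls back to False and the original
        # two-argument call is used.
        if getattr(self, "supports_task_context_logging", False):
            super().set_context(ti, identifier=identifier)
        else:
            super().set_context(ti)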
pankajkoti committed Nov 17, 2023
1 parent 0c6fd5b commit 9e25b81
Showing 1 changed file with 6 additions and 3 deletions.
9 changes: 6 additions & 3 deletions airflow/providers/amazon/aws/log/s3_task_handler.py
@@ -71,14 +71,17 @@ def hook(self):
             aws_conn_id=conf.get("logging", "REMOTE_LOG_CONN_ID"), transfer_config_args={"use_threads": False}
         )
 
-    def set_context(self, ti):
-        super().set_context(ti)
+    def set_context(self, ti, identifier=None):
+        if getattr(self, "supports_task_context_logging", False):
+            super().set_context(ti, identifier=identifier)
+        else:
+            super().set_context(ti)
         # Local location and remote location is needed to open and
         # upload local log file to S3 remote storage.
         full_path = self.handler.baseFilename
         self.log_relative_path = pathlib.Path(full_path).relative_to(self.local_base).as_posix()
         is_trigger_log_context = getattr(ti, "is_trigger_log_context", False)
-        self.upload_on_close = is_trigger_log_context or not ti.raw
+        self.upload_on_close = is_trigger_log_context or not getattr(ti, "raw", None)
         # Clear the file first so that duplicate data is not uploaded
         # when re-using the same path (e.g. with rescheduled sensors)
         if self.upload_on_close:
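
For illustration, a minimal usage sketch of the changed handler. The constructor arguments (base_log_folder, s3_log_folder), the bucket path, the identifier value, and the pre-existing TaskInstance `ti` are assumptions, not part of this commit:

    from airflow.providers.amazon.aws.log.s3_task_handler import S3TaskHandler

    # Hypothetical local and remote log locations.
    handler = S3TaskHandler(
        base_log_folder="/tmp/airflow/logs",
        s3_log_folder="s3://my-bucket/airflow/logs",
    )

    # `ti` is assumed to be an existing TaskInstance for the running task.
    # On Airflow cores that include PR apache#32646, the identifier is forwarded
    # to the base handler's set_context(); on older cores the handler falls back
    # to the two-argument call, so this remains safe to invoke either way.
    handler.set_context(ti, identifier="trigger")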
