Conversation
@amoghrajesh amoghrajesh commented Jan 20, 2026


Was generative AI tooling used to co-author this PR?
  • Yes (please specify the tool below)
    Used Cursor with Claude Sonnet 4.5, mostly for tests

Problem

task-sdk had several imports from airflow-core for remote logging, for instance:

  • from airflow.logging_config import RemoteLogIO, get_remote_task_log, get_default_remote_conn_id
  • from airflow.configuration import conf (used at least 4 times in log.py)

This coupling makes client-server separation difficult, since the task SDK should not import from core Airflow.

Proposal

To the existing shared library (logging), I am adding utilities containing the remote logging protocols and discovery logic. Both core and the SDK now use this shared code, but each uses its own configuration source (conf).

For context as to what's being moved:

  • The RemoteLogIO and RemoteLogStreamIO protocols define the interface that S3, GCS, CloudWatch and other remote log handlers implement. These protocols are now in the shared library where both core and sdk (and providers) can reference them.

  • The discovery logic responsible for finding and loading the remote log handler from the logging config module is extracted into a helper discover_remote_log_handler() function. This function takes the config paths and import function as parameters, so core and the SDK can each inject their own config.
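For illustration, such a protocol can be pictured roughly like this. The method names and signatures below are simplified stand-ins, not the actual interface from the PR; the point is that provider handlers only need to match the protocol structurally:

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class RemoteLogIO(Protocol):
    """Illustrative sketch of a remote-log handler interface (simplified)."""

    def upload(self, path: str, ti) -> None:
        """Push a finished local log file to remote storage."""
        ...

    def read(self, relative_path: str, ti) -> tuple[list[str], list[str]]:
        """Fetch remote log content; returns (messages, log_lines)."""
        ...


class S3RemoteLogIO:
    """A provider handler matches the protocol without inheriting from it."""

    def upload(self, path: str, ti) -> None:
        print(f"uploading {path} to s3")

    def read(self, relative_path: str, ti) -> tuple[list[str], list[str]]:
        return ([], [f"log line from {relative_path}"])


print(isinstance(S3RemoteLogIO(), RemoteLogIO))  # -> True
```

Because the protocol lives in the shared library, core, the SDK, and providers can all type-check against it without importing each other.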

Summary of changes

Airflow core's airflow/logging/remote.py is now just a back-compat shim that re-exports from the shared library. The logging_config.py module uses the shared discovery function instead of its earlier logic.
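The config-injection idea behind discover_remote_log_handler() can be sketched like so. Parameter names and the REMOTE_TASK_LOG attribute here are assumptions for illustration, not the helper's real signature:

```python
import types
from typing import Any, Callable


def discover_remote_log_handler(
    remote_logging_enabled: bool,
    logging_class_path: str,
    import_fn: Callable[[str], Any],
) -> Any:
    """Sketch of config-injected discovery (illustrative names only).

    The caller -- core or the task SDK -- passes in its own config values
    and import function, so this helper depends on neither config system.
    """
    if not remote_logging_enabled:
        return None
    # Load the logging-config module and pull out the handler it exposes.
    module = import_fn(logging_class_path)
    return getattr(module, "REMOTE_TASK_LOG", None)


# Core would pass conf.getboolean(...) and importlib.import_module;
# here we fake both to show the injection mechanism.
fake_module = types.SimpleNamespace(REMOTE_TASK_LOG="my-handler")
handler = discover_remote_log_handler(True, "my.log.config", lambda path: fake_module)
print(handler)  # -> my-handler
```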

For task-sdk, sdk/log.py gets its own _ActiveLoggingConfig class and discovery logic exports. All imports from core are replaced with SDK or shared imports, and the earlier TODOs are removed too.
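The back-compat shim mentioned above is the standard re-export pattern: the old module keeps its import surface by re-exporting from the new home. A self-contained illustration using stand-in module names (the real import paths are in the PR diff):

```python
import sys
import types

# Stand-in for the shared library module where the code now lives.
shared = types.ModuleType("shared_logging")
shared.RemoteLogIO = type("RemoteLogIO", (), {})
sys.modules["shared_logging"] = shared

# Stand-in for the old core module, reduced to a re-export shim.
shim = types.ModuleType("old_remote")
exec(
    "from shared_logging import RemoteLogIO\n__all__ = ['RemoteLogIO']",
    shim.__dict__,
)
sys.modules["old_remote"] = shim

# Old-style imports keep resolving to the one shared class.
from old_remote import RemoteLogIO

print(RemoteLogIO is shared.RemoteLogIO)  # -> True
```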

Impact on remote logging

No breaking changes.

  • Provider remote log handlers work unchanged: same configuration mechanism, same connection ID options.
  • The difference is internal: task processes now use the SDK's config instead of core's.

For confidence, the entire testing flow is below:

  1. Run breeze with localstack integration after setting these env vars:
export AIRFLOW__LOGGING__REMOTE_LOGGING=true
export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=s3://test-airflow-logs
export AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=aws_default
export AIRFLOW__LOGGING__DELETE_LOCAL_LOGS=false
export AIRFLOW_CONN_AWS_DEFAULT='aws://test:test@?endpoint_url=http://localstack:4566&region_name=us-east-1'

breeze start-airflow --integration localstack

  2. Create a simple DAG like this one:
from datetime import datetime
from airflow.sdk import DAG
from airflow.providers.standard.operators.python import PythonOperator


def test_logging():
    import logging
    logger = logging.getLogger(__name__)

    logger.info("Testing remote logging with LocalStack")
    logger.warning("This should appear in S3")

    for i in range(5):
        logger.info(f"Log message {i}")

    print("Print statement test")


with DAG(
    "test_remote_logging",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    PythonOperator(
        task_id="log_test",
        python_callable=test_logging,
    )
  3. Trigger the DAG and observe the logs (screenshot in the PR)
  4. Check the LocalStack logs for incoming requests (screenshot in the PR)
  5. Check the logs on the LocalStack container using awslocal as well (screenshot in the PR)

NOTE:

While I was at it, I realised that the SDK conf didn't support expansion variables like AIRFLOW_HOME, which led to remote log tests failing with:

Failed: Expected at least 6 log files in S3 bucket test-airflow-logs, but found 0 objects: []

This was because AIRFLOW_HOME was not being expanded:

DEBUG upload_to_remote(): raw_logger=<BytesLogger(file=<_io.BufferedWriter name='{AIRFLOW_HOME}/logs/dag_id=example_xcom_test/...'>>

Fixed in 43319c3
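The symptom can be reproduced in miniature. The expand_airflow_home helper below is purely illustrative (it mirrors the bug, not the actual fix in the commit):

```python
import os


def expand_airflow_home(path: str) -> str:
    """Illustrative only: expand a literal {AIRFLOW_HOME} placeholder
    using the environment, so log paths resolve to real directories."""
    return path.replace("{AIRFLOW_HOME}", os.environ.get("AIRFLOW_HOME", ""))


os.environ["AIRFLOW_HOME"] = "/opt/airflow"
raw = "{AIRFLOW_HOME}/logs/dag_id=example_xcom_test"
print(expand_airflow_home(raw))  # -> /opt/airflow/logs/dag_id=example_xcom_test
```

Without the expansion, the writer opened a directory literally named {AIRFLOW_HOME}, so nothing ever reached the S3 bucket.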


@jason810496 jason810496 left a comment

Nice! LGTM overall.

@potiuk potiuk left a comment

LGTM


potiuk commented Jan 20, 2026

With tests passing of course and the few comments by @jason810496

@amoghrajesh

Sorry for the noise folks! Bad rebase

@amoghrajesh

@jason810496 are you OK with the changes here?

@jason810496 jason810496 left a comment

> @jason810496 are you OK with the changes here?

Yeah, LGTM once the CI pass.

@amoghrajesh

CI is finally green, merging this.

@amoghrajesh amoghrajesh merged commit 26c8c9c into apache:main Jan 23, 2026
247 of 248 checks passed
@amoghrajesh amoghrajesh deleted the decouple-remote-log-from-core branch January 23, 2026 04:55
@jason810496

> CI is finally green, merging this.

Nice! Thanks Amogh.

suii2210 pushed a commit to suii2210/airflow that referenced this pull request Jan 26, 2026