Apache Airflow version: 2.0.1
Environment:
- Cloud provider or hardware configuration: on my laptop
- OS (e.g. from /etc/os-release): macOS Mojave 10.14.6
- Kernel (e.g. uname -a): Darwin Wongs-MBP 18.7.0 Darwin Kernel Version 18.7.0: Tue Jan 12 22:04:47 PST 2021; root:xnu-4903.278.56~1/RELEASE_X86_64 x86_64
What happened:
Configured remote logging to an S3 bucket, but only the logs of DAG runs appeared in the bucket. Logs of the Airflow server components (scheduler, web server, etc.) did not appear.
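For reference, here is a quick way to inspect which loggers actually receive the remote handler. This is a minimal sketch, assuming it is run in the same virtualenv as the Airflow 2.0.1 install with the environment variables below set; the logger names are the ones from Airflow's default logging config.

```python
# Minimal sketch (assumption: run inside the same virtualenv as Airflow 2.0.1,
# with the remote-logging environment variables from the repro steps exported).
# Importing airflow applies the configured logging setup, so we can inspect
# which loggers actually received a remote (S3) handler.
import logging

import airflow  # noqa: F401  (import side effect: applies Airflow's logging config)

for name in ("airflow.task", "airflow.processor", "flask_appbuilder"):
    print(name, "->", [type(h).__name__ for h in logging.getLogger(name).handlers])
```

With the configuration below, only `airflow.task` appears to get an S3-backed handler; the scheduler and web server loggers keep their local/console handlers.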
What you expected to happen:
All logs (scheduler, web server, and DAG run logs) to go to the S3 bucket.
How to reproduce it:
- Follow the quick start guide at https://airflow.apache.org/docs/apache-airflow/stable/start/local.html
- Before starting the web server, set the following environment variables:
  export AIRFLOW__LOGGING__REMOTE_LOGGING=True
  export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=s3://my-bucket/
  export AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=my_remote_logging_conn_id
- Start the web server and add your S3 connection in the web server "Connections" section (a quick sanity check for these credentials is sketched after this list):
  Conn Id:    my_remote_logging_conn_id
  Conn Type:  S3
  Extra:      {"region_name": "nyc3",
               "host": "https://nyc3.digitaloceanspaces.com",
               "aws_access_key_id": "xxx",
               "aws_secret_access_key": "xxx"}
- Restart the web server
- Start the scheduler in another console window (setting the same env variables)
- Execute a DAG
- Head to your S3 bucket UI; you will see that only the logs of DAG runs appear.
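To double-check the credentials from the connection Extra and to see exactly which log keys end up in the bucket, here is a minimal sketch using boto3 directly (not Airflow); the bucket name, endpoint, and keys are the placeholders from the steps above.

```python
# Minimal sketch using boto3 directly to verify the same credentials/endpoint
# as in the connection Extra, and to list which log keys land in the bucket.
# All values below are placeholders taken from the repro steps.
import boto3

s3 = boto3.client(
    "s3",
    region_name="nyc3",
    endpoint_url="https://nyc3.digitaloceanspaces.com",
    aws_access_key_id="xxx",
    aws_secret_access_key="xxx",
)

# List everything under the remote base log folder (s3://my-bucket/).
resp = s3.list_objects_v2(Bucket="my-bucket")
for obj in resp.get("Contents", []):
    print(obj["Key"])  # per the report, only per-task DAG run logs show up here
```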