Description
Apache Airflow version
Other Airflow 2 version (please specify below)
What happened
Hi Team,
Currently, I am running docker-compose using the airflow:2.5.3-python3.8 image. When I run my DAG, everything looks good except that I am getting these ugly asterisks everywhere the word 'airflow' is supposed to be. I double-checked my config and paths and still have the same outcome. Please let me know if there is something I may be doing incorrectly.
Here are the logs:
*** Reading local file: /opt/airflow/logs/dag_id=heartbeat/run_id=manual__2023-06-27T00:25:10.808687+00:00/task_id=heartbeat/attempt=1.log
[2023-06-26, 20:25:22 EDT] {taskinstance.py:1090} INFO - Dependencies all met for dep_context=non-requeueable deps ti=<TaskInstance: heartbeat.heartbeat manual__2023-06-27T00:25:10.808687+00:00 [queued]>
[2023-06-26, 20:25:22 EDT] {taskinstance.py:1090} INFO - Dependencies all met for dep_context=requeueable deps ti=<TaskInstance: heartbeat.heartbeat manual__2023-06-27T00:25:10.808687+00:00 [queued]>
[2023-06-26, 20:25:22 EDT] {taskinstance.py:1288} INFO -
--------------------------------------------------------------------------------
[2023-06-26, 20:25:22 EDT] {taskinstance.py:1289} INFO - Starting attempt 1 of 1
[2023-06-26, 20:25:22 EDT] {taskinstance.py:1290} INFO -
--------------------------------------------------------------------------------
[2023-06-26, 20:25:22 EDT] {taskinstance.py:1309} INFO - Executing <Task(PipelineOperator): heartbeat> on 2023-06-27 00:25:10.808687+00:00
[2023-06-26, 20:25:22 EDT] {standard_task_runner.py:55} INFO - Started process 21159 to run task
[2023-06-26, 20:25:22 EDT] {standard_task_runner.py:82} INFO - Running: ['***', 'tasks', 'run', 'heartbeat', 'heartbeat', 'manual__2023-06-27T00:25:10.808687+00:00', '--job-id', '26', '--raw', '--subdir', 'DAGS_FOLDER/heartbeat.py', '--cfg-path', '/tmp/tmpp2vbdho7']
[2023-06-26, 20:25:22 EDT] {standard_task_runner.py:83} INFO - Job 26: Subtask heartbeat
[2023-06-26, 20:25:23 EDT] {task_command.py:389} INFO - Running <TaskInstance: heartbeat.heartbeat manual__2023-06-27T00:25:10.808687+00:00 [running]> on host c8908875256e
[2023-06-26, 20:25:23 EDT] {taskinstance.py:1516} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=heartbeat
AIRFLOW_CTX_TASK_ID=heartbeat
AIRFLOW_CTX_EXECUTION_DATE=2023-06-27T00:25:10.808687+00:00
AIRFLOW_CTX_TRY_NUMBER=1
AIRFLOW_CTX_DAG_RUN_ID=manual__2023-06-27T00:25:10.808687+00:00
[2023-06-26, 20:25:23 EDT] {operators.py:514} INFO - Using common lib version [1.1.91].
[2023-06-26, 20:25:23 EDT] {operators.py:171} INFO - Building end_interval using data_interval_end: 2023-06-27T00:25:00+00:00, data_interval_type : SINCE_LAST_SUCCESS
[2023-06-26, 20:25:23 EDT] {operators.py:179} INFO - end_interval = 2023-06-27T00:25:00+00:00
[2023-06-26, 20:25:23 EDT] {logging_mixin.py:137} INFO - Run id for the Lineage API is being set to: 88562ae5-3511-37a6-a361-59ebb4b3e111
[2023-06-26, 20:25:23 EDT] {operators.py:139} INFO - Building start_interval using data_interval_start: 2023-06-27T00:20:00+00:00, prev_data_interval_end_success : None,data_interval_type : SINCE_LAST_SUCCESS
[2023-06-26, 20:25:23 EDT] {operators.py:166} INFO - start_interval = 2023-06-27T00:20:00+00:00
[2023-06-26, 20:25:23 EDT] {operators.py:171} INFO - Building end_interval using data_interval_end: 2023-06-27T00:25:00+00:00, data_interval_type : SINCE_LAST_SUCCESS
[2023-06-26, 20:25:23 EDT] {operators.py:179} INFO - end_interval = 2023-06-27T00:25:00+00:00
[2023-06-26, 20:25:23 EDT] {operators.py:611} INFO - Bash command configuration: /opt/***/run-pipeline-shared.sh general heartbeat/heartbeat.py
None of my other Airflow servers have this issue, and I have checked to ensure that `sensitive_var_conn_names = False` and `hide_sensitive_var_conn_fields = False`. Not sure what else to do at this point in time.
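For context, the stock `docker-compose.yaml` from the Airflow docs sets several credentials to the literal string `airflow` (excerpt below is approximate, reproduced from memory). Since connection passwords can be fed to the secrets masker, any log line containing the word `airflow` would then be redacted:

```yaml
# Approximate excerpt from the official docker-compose.yaml for Airflow 2.x
postgres:
  image: postgres:13
  environment:
    POSTGRES_USER: airflow
    POSTGRES_PASSWORD: airflow   # the password value is the word "airflow" itself
    POSTGRES_DB: airflow
```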
What you think should happen instead
Instead of `***`, the word `airflow` should be shown in the logs.
How to reproduce
Using docker-compose and the airflow:2.5.3-python3.8 image, run any basic DAG, let Airflow pick it up, then open the task logs and observe the asterisks.
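To illustrate what the logs show: Airflow's secrets masker replaces every occurrence of a registered secret value with `***` in log records. The sketch below is a loose, simplified re-implementation of that behavior (not Airflow's actual `SecretsMasker` code), assuming the string `airflow` has been registered as a secret, e.g. because it is a connection password:

```python
import re

def mask_secrets(line, secrets):
    """Replace every occurrence of a registered secret value with '***',
    loosely mimicking what Airflow's secrets masker does to log records."""
    for secret in secrets:
        line = re.sub(re.escape(secret), "***", line)
    return line

# Hypothetical: the word "airflow" has been registered as a secret value,
# as would happen if a connection password were literally "airflow".
secrets = {"airflow"}

masked = mask_secrets("Running: ['airflow', 'tasks', 'run', 'heartbeat']", secrets)
print(masked)
# -> Running: ['***', 'tasks', 'run', 'heartbeat']
```

This matches the pattern in the logs above, where the command `['***', 'tasks', 'run', ...]` and the path `/opt/***/run-pipeline-shared.sh` have the word `airflow` redacted.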
Operating System
Debian GNU/Linux 11 (bullseye)
Versions of Apache Airflow Providers
No response
Deployment
Docker-Compose
Deployment details
No response
Anything else
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct