Description
In order to copy the dbt logs over to the Airflow logger we use a bit of a hack: we force dbt to log to a temporary file and then write the contents of that file to the Airflow logger. This is similar to what Airflow's BashOperator does, which reads its output back from the subprocess as it runs: https://github.com/apache/airflow/blob/v1-10-stable/airflow/operators/bash_operator.py#L155
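For reference, a minimal sketch of the file-based approach described above (names like `run_dbt` are illustrative stand-ins, not the operator's actual code):

```python
import logging
import tempfile

log = logging.getLogger("airflow.task")


def run_dbt(log_path: str) -> None:
    # Hypothetical stand-in for a dbt invocation configured to log to log_path.
    with open(log_path, "w") as f:
        f.write("dbt run output...\n")


with tempfile.TemporaryDirectory() as tmp_dir:
    log_path = f"{tmp_dir}/dbt.log"
    run_dbt(log_path=log_path)
    # The file is only read back once the run finishes, and it is read in one
    # go, so the entire dbt log ends up in the worker's memory at once.
    with open(log_path) as f:
        log.info(f.read())
```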
Unfortunately, since dbt logs quite a bit, this may lead the worker to hit its memory limit: the log file keeps growing and is only read back, in full, at the end of the run. We should properly override the dbt logger with Airflow's logger so that we can log directly as we go, instead of having to rely on a file.
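A minimal sketch of that direction, assuming we can attach a standard `logging.Handler` that forwards records to the Airflow task logger as they are emitted. The logger name `"dbt"` and the exact hook point are assumptions here and depend on how the installed dbt version exposes its logging:

```python
import logging


class AirflowLoggingHandler(logging.Handler):
    """Forward log records to another logger (e.g. the Airflow task logger)."""

    def __init__(self, target: logging.Logger, level: int = logging.NOTSET) -> None:
        super().__init__(level=level)
        self.target = target

    def emit(self, record: logging.LogRecord) -> None:
        # Hand the record straight to the target logger, so dbt output shows
        # up in the Airflow task log while the run is still in progress.
        self.target.handle(record)


# Inside an operator's execute(), something along these lines:
airflow_task_logger = logging.getLogger("airflow.task")
dbt_logger = logging.getLogger("dbt")  # assumption: actual logger name may differ by dbt version
handler = AirflowLoggingHandler(airflow_task_logger)
dbt_logger.addHandler(handler)
try:
    pass  # run dbt here
finally:
    dbt_logger.removeHandler(handler)
```

This removes the intermediate file entirely: each record is written to the task log as it is produced, so memory use no longer grows with the size of the dbt output.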