### Apache Airflow version
2.2.3 (latest released)
### What happened
In the configuration file, we specified `sql_alchemy_conn_cmd` so that the main SQLAlchemy connection string is loaded from a command. However, with this configuration, running any DAG / task produces the following error:
[2022-02-16, 06:14:09 UTC] {base_task_runner.py:117} INFO - Job 366: Subtask greet Traceback (most recent call last):
[2022-02-16, 06:14:09 UTC] {base_task_runner.py:117} INFO - Job 366: Subtask greet File "/u/peep/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
[2022-02-16, 06:14:09 UTC] {base_task_runner.py:117} INFO - Job 366: Subtask greet self.dialect.do_execute(
[2022-02-16, 06:14:09 UTC] {base_task_runner.py:117} INFO - Job 366: Subtask greet File "/u/peep/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
[2022-02-16, 06:14:09 UTC] {base_task_runner.py:117} INFO - Job 366: Subtask greet cursor.execute(statement, parameters)
[2022-02-16, 06:14:09 UTC] {base_task_runner.py:117} INFO - Job 366: Subtask greet sqlite3.OperationalError: no such table: dag_run
### What you expected to happen
Tasks should run without error when using `sql_alchemy_conn_cmd`.
### How to reproduce
Set `sql_alchemy_conn_cmd` in the configuration file; any task you then run will try to use SQLite instead of the configured database connection and cause all sorts of havoc.
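For reference, a configuration of roughly this shape triggers the behaviour (the command and secret path below are hypothetical examples, not our actual values):

```ini
[core]
# Instead of a literal sql_alchemy_conn, load the connection string
# from the stdout of a command:
sql_alchemy_conn_cmd = cat /path/to/sql_alchemy_conn_secret
```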
### Operating System
Ubuntu 20.04.3 (focal)
### Versions of Apache Airflow Providers
apache-airflow-providers-celery==2.1.0
apache-airflow-providers-docker==2.3.0
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-jdbc==2.0.1
apache-airflow-providers-mysql==2.1.1
apache-airflow-providers-odbc==2.0.1
apache-airflow-providers-oracle==2.0.1
apache-airflow-providers-redis==2.0.1
apache-airflow-providers-slack==4.1.0
apache-airflow-providers-sqlite==2.0.1
apache-airflow-providers-vertica==2.0.1
### Deployment
Virtualenv installation
### Deployment details
Airflow installed on four EC2 instances in AWS (2 master, 2 worker).
### Anything else
This problem occurs because of commit a90878cf660ffe73973b4e4487c1e691cc212925, where the `include_cmds` flag is set to `False`.
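To illustrate the effect of that flag, here is a simplified sketch (not Airflow's actual code; all names are hypothetical) of how a `_cmd` config option is resolved. When `include_cmds` is `False`, the command variant is never executed, so the default SQLite connection string wins, which matches the `sqlite3.OperationalError` above:

```python
import subprocess

def resolve_option(section, key, include_cmds=True, defaults=None, cmds=None):
    """Return the value for (section, key), preferring a '<key>_cmd' entry
    whose command output supplies the value -- unless include_cmds is False."""
    defaults = defaults or {}
    cmds = cmds or {}
    if include_cmds and (section, key + "_cmd") in cmds:
        # Run the configured command and use its stdout as the value.
        command = cmds[(section, key + "_cmd")]
        return subprocess.check_output(command, shell=True, text=True).strip()
    # Command variants skipped: fall back to the plain default value.
    return defaults.get((section, key))

cmds = {("core", "sql_alchemy_conn_cmd"): "echo postgresql://db/airflow"}
defaults = {("core", "sql_alchemy_conn"): "sqlite:////tmp/airflow.db"}

# include_cmds=True: the command runs and the real connection is used.
print(resolve_option("core", "sql_alchemy_conn", include_cmds=True,
                     defaults=defaults, cmds=cmds))
# include_cmds=False: the sqlite default silently wins instead.
print(resolve_option("core", "sql_alchemy_conn", include_cmds=False,
                     defaults=defaults, cmds=cmds))
```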
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)