Both Snowflake and PostgreSQL Don't Commit Database Changes #34468
I'm trying to get a feel for Airflow. I'm running version 2.7.1 in standalone mode with the SQLite backend and nothing fancy in the config file, on a laptop with PostgreSQL running on the same machine. I checked the auth credentials; they work. I have a very simple task: open a file containing an UPDATE query and run the query. The task returns success, but the column/row isn't updated. The behavior is the same for Snowflake. I can even append a COMMIT as a second query, and the UPDATE still shows as successful, yet the row is not changed. Is there something I'm missing in setting up the connection (pg_for_me)? Here's what's in the flat file:
Here's the code for the PostgreSQL job. Here's the logging for the task:
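(Editor's note: the flat file, job code, and log attachments are not reproduced here.) The symptom described above, an UPDATE that reports success but never lands in the database, is the classic sign of a session being closed without a commit. It can be reproduced with nothing but the stdlib `sqlite3` module; the table and column names below are made up for illustration:

```python
import os
import sqlite3
import tempfile

# Throwaway database file in a temp directory (illustrative path only).
path = os.path.join(tempfile.mkdtemp(), "demo.db")

# Seed one row and commit it so it is durably on disk.
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE t (id INTEGER, val TEXT)")
conn.execute("INSERT INTO t VALUES (1, 'old')")
conn.commit()

# Run an UPDATE; the statement itself "succeeds" (no exception is raised) ...
conn.execute("UPDATE t SET val = 'new' WHERE id = 1")
# ... but closing without commit() rolls back the implicit transaction.
conn.close()

# A fresh connection still sees the pre-UPDATE value.
conn2 = sqlite3.connect(path)
row = conn2.execute("SELECT val FROM t WHERE id = 1").fetchone()
conn2.close()
print(row[0])  # still 'old'
```

Note that appending `COMMIT` as a second statement does not help if the driver executes each statement in its own implicit transaction, which matches what the poster observed.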
Just wondering: are you trying to run an Operator inside a PythonOperator / TaskFlow task?
PostgresHook actually performs the update.
SQLExecuteQueryOperator does not notify you in any way that an update has failed to take effect: despite the existence of an autocommit option, it doesn't commit, yet it still returns success. I'm too far out of my comfort zone to submit a bug report on this, because I know so little. So, here it stays.
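For readers hitting the same wall, a sketch of the two approaches discussed in this thread. This assumes the `apache-airflow-providers-common-sql` and `apache-airflow-providers-postgres` packages are installed; `pg_for_me` is the connection id from the question, and the SQL file name and table/column names are placeholders, not the poster's actual code:

```python
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook

# Approach 1: the operator, with autocommit enabled explicitly.
# (Per this thread, the poster reported the update still did not land
# for them even with this flag, so treat it as a first thing to try.)
update_task = SQLExecuteQueryOperator(
    task_id="run_update",
    conn_id="pg_for_me",
    sql="update.sql",   # templated file containing the UPDATE statement
    autocommit=True,
)

# Approach 2: run the statement through the hook directly, which is what
# worked for the poster ("PostgresHook actually performs the update").
def run_update() -> None:
    hook = PostgresHook(postgres_conn_id="pg_for_me")
    hook.run("UPDATE my_table SET col = 'x' WHERE id = 1", autocommit=True)
```

Since this is a DAG fragment rather than a standalone script, it is shown here untested; consult the provider documentation for the exact semantics of `autocommit` in your Airflow version.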