Apache Airflow version
Other Airflow 2 version (please specify below)
What happened
Hey Team,
I am not sure what I am doing wrong. I am running a DAG with retries set to 5. My expectation is that if a task fails, it will be retried 5 times, and if it still does not pass, it will be marked as failed. Instead, the task was marked as success and triggered its downstream task, even though it had not completed all the updates required for it to finish. Am I missing a required parameter? Thanks. A minimal sketch of the kind of DAG I am running is below.
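This is a minimal sketch of the setup, assuming the Google provider's `DataprocSubmitJobOperator`; the DAG id, project, cluster, bucket, and region are illustrative placeholders, not my real values:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

# Illustrative job spec: a PySpark job on an existing Dataproc cluster
# that performs the BigQuery updates.
PYSPARK_JOB = {
    "reference": {"project_id": "my-project"},            # placeholder
    "placement": {"cluster_name": "my-dataproc-cluster"},  # placeholder
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/update_bq.py"},
}

with DAG(
    dag_id="bq_update_dag",                 # placeholder name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={
        "retries": 5,                       # expect 5 retries, then a failed state
        "retry_delay": timedelta(minutes=5),
    },
) as dag:
    update_bq = DataprocSubmitJobOperator(
        task_id="update_bq_table",
        job=PYSPARK_JOB,
        region="us-central1",               # placeholder
        project_id="my-project",            # placeholder
    )
```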
Note: I am using a Dataproc cluster to update records in a BigQuery table, and BigQuery has a quota of 1,500 updates per table per day. That limit was exceeded, but Airflow did not mark the task as failed; instead it triggered the downstream task. Is there a fix we can use so that the task fails when the limit/quota is exceeded?
Error: 403 BQ limit exceeded.
What you think should happen instead
After the limit is exceeded, the task should be marked as failed.
How to reproduce
Run a BigQuery update task on a Dataproc cluster, set retries to 5, and make updates to a BigQuery table. BigQuery's quota is 1,500 updates per table per day, so try to make around 2,000 updates to a single table in one day. Then check whether the task fails, or whether it is marked as success and the downstream task is triggered anyway. (Also check your BQ table to confirm whether all your updated records are actually there.) A sketch of the kind of update workload is below.
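This is a sketch of the update workload that trips the quota, assuming the `google-cloud-bigquery` client running inside the Dataproc job; the project, dataset, table, and column names are illustrative placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Roughly 2,000 single-row UPDATE statements against one table -- well past
# BigQuery's ~1,500 DML-updates-per-table-per-day quota.
for i in range(2000):
    job = client.query(
        f"UPDATE `my-project.my_dataset.my_table` "
        f"SET value = {i} WHERE id = {i}"
    )
    # result() blocks until the query job finishes and raises
    # google.api_core.exceptions.Forbidden (403) once the quota is exceeded,
    # which is the error I see in the logs.
    job.result()
```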
Please let us know what we can do to handle this scenario.
Operating System
Windows
Versions of Apache Airflow Providers
No response
Deployment
Composer
Deployment details
No response
Anything else
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct