Fix BigQueryInsertJobOperator error handling in deferrable mode #46160

@Mohammed-Karim226

Description

Apache Airflow version

main (development)

If "Other Airflow 2 version" selected, which one?

I am using Apache Airflow version 2.6.2.

What happened?

When using the BigQueryInsertJobOperator in deferrable mode, if the job fails without being deferred, the task does not raise an exception or fail as expected. Instead, it continues execution without proper error handling, leading to incorrect task states and potential data inconsistencies.

What you think should happen instead?

The task should raise an exception and fail immediately if the job encounters an error in non-deferred mode, ensuring proper error handling and task state management.
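A minimal sketch of the kind of check being described, using a stand-in job object and a stand-in AirflowException (the real operator lives in the Google provider; `error_result` mirrors the field the BigQuery client exposes on a finished job — this is an illustration of the expected behavior, not the provider's actual code):

```python
from dataclasses import dataclass
from typing import Optional


class AirflowException(Exception):
    """Stand-in for airflow.exceptions.AirflowException."""


@dataclass
class FakeJob:
    """Stand-in for a BigQuery job object; `error_result` mirrors
    the field set on a job that finished with an error."""
    state: str
    error_result: Optional[dict] = None


def handle_completed_job(job):
    """If the job already finished with an error, there is nothing to
    defer: the operator should raise so the task is marked failed,
    instead of returning normally and leaving the task in SUCCESS."""
    if job.state == "DONE":
        if job.error_result:
            raise AirflowException(f"BigQuery job failed: {job.error_result}")
        return job
    # Otherwise, in deferrable mode, the operator would defer to the
    # trigger here and let the triggerer watch the job.
```

The point is that the error check must run on the non-deferred path as well, so a job that fails before deferral raises immediately.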

How to reproduce

  1. Set up a DAG with the BigQueryInsertJobOperator in deferrable mode.
  2. Configure the operator to execute a job that will fail (e.g., an invalid SQL query).
  3. Trigger the DAG and observe the task behavior.
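The steps above can be sketched as a minimal DAG (the DAG id, task id, and the deliberately broken query are placeholders; a configured GCP connection is assumed):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="bq_deferrable_failure_repro",
    start_date=datetime(2025, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Deliberately invalid SQL: the job fails immediately, before the
    # operator ever defers, which is the code path under discussion.
    BigQueryInsertJobOperator(
        task_id="broken_query",
        configuration={
            "query": {
                "query": "SELEC 1",  # typo on purpose
                "useLegacySql": False,
            }
        },
        deferrable=True,
    )
```

Triggering this DAG should show the task ending in a non-failed state despite the job error.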

Operating System

Ubuntu 20.04 LTS

Versions of Apache Airflow Providers

8.10.0

Deployment

Docker-Compose

Deployment details

  • Airflow version: 2.6.2
  • Docker Compose version: 2.17.2
  • Kubernetes version: N/A
  • Custom configurations: None

Anything else?

This issue occurs every time a non-deferred job fails in the BigQueryInsertJobOperator. Below are the relevant logs:

[2025-01-28T12:34:56.789Z] ERROR - Task failed but did not raise an exception. Task state: SUCCESS

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

  • I agree to follow this project's Code of Conduct