Airflow Bug Report #45905

@Abhimanuchauhan01

Description

Apache Airflow version

2.10.4

If "Other Airflow 2 version" selected, which one?

No response

What happened?

While running a DAG with a task configured with retries=3, the task did not stop retrying after three failed attempts. Instead, it kept retrying indefinitely, ignoring the configured limit.

Context in which the problem occurred:
I encountered this issue after upgrading Apache Airflow from version 2.7.2 to 2.7.3. The DAG is set up to use the CeleryExecutor, and the task in question is a PythonOperator that calls a simple Python function. The problem is reproducible every time the task fails.

What you think should happen instead?

The task should fail permanently after exhausting the configured retries=3 attempts. The system should respect the retries parameter and not continue retrying indefinitely.

What went wrong?
It appears that the retries parameter is either being ignored or overridden. This may be a regression introduced in the upgrade from 2.7.2 to 2.7.3, or it could be related to the CeleryExecutor not correctly honoring the retries setting.
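For reference, this is the retry accounting I would expect: with retries=3, a task gets one initial attempt plus up to three retries, so four attempts in total before a permanent failure. The sketch below simulates that accounting; it is an assumption based on how Airflow's TaskInstance compares try_number against its retry limit, not the actual scheduler code.

```python
# Sketch of the expected retry accounting (an assumption, not Airflow's
# actual scheduler code). With retries=3 a task gets at most 4 attempts.

def is_eligible_to_retry(try_number: int, retries: int) -> bool:
    """Return True if another attempt is allowed after a failure.

    try_number is 1-based: attempt 1 is the initial run, attempts 2..N
    are retries.
    """
    return try_number <= retries

def run_until_exhausted(retries: int) -> int:
    """Simulate a task that always fails; count total attempts made."""
    attempts = 0
    try_number = 1
    while True:
        attempts += 1  # the attempt runs and fails
        if not is_eligible_to_retry(try_number, retries):
            break      # retries exhausted -> permanent failure
        try_number += 1
    return attempts

print(run_until_exhausted(3))  # → 4
```

If the bug is real, the observed attempt count in the UI keeps growing past this bound instead of stopping at four.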

How to reproduce

Steps to Reproduce the Problem:

Environment Setup:

Airflow Version: 2.7.3
Executor: CeleryExecutor
Scheduler: Default Airflow Scheduler
Python Version: 3.10
Database: PostgreSQL 13
Operating System: Ubuntu 22.04
DAG Configuration:
Create a minimal DAG with the following structure:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def fail_task():
    raise ValueError("This task is designed to fail.")


with DAG(
    dag_id='retry_issue_demo',
    start_date=datetime(2025, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    failing_task = PythonOperator(
        task_id='failing_task',
        python_callable=fail_task,
        retries=3,  # the task should be retried at most 3 times after the first failure
    )
```
Execution:

Deploy the DAG to your Airflow environment.
Trigger the DAG manually via the Airflow UI or CLI.
Observation:

Monitor the task failing_task in the Airflow UI.
Observe the task retry behavior and note whether it respects the configured retries=3.
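To make the observation step concrete, a tiny check like the one below can classify what you see in the UI. The helper name and logic are mine (hypothetical, for illustration only): it simply tests whether the recorded attempt count stays within the retries + 1 bound.

```python
# Hypothetical helper (the name is mine, not Airflow's) to evaluate the
# attempt count observed in the UI against the configured retry limit.

def retry_limit_respected(observed_attempts: int, retries: int) -> bool:
    """True if the observed attempts stay within the initial run + retries."""
    return observed_attempts <= retries + 1

print(retry_limit_respected(4, 3))  # → True  (within the limit)
print(retry_limit_respected(7, 3))  # → False (limit exceeded: bug reproduced)
```

With retries=3, any recorded attempt beyond the fourth indicates the bug has reproduced.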
Expected Behavior:
The task should fail permanently after three retry attempts.

Actual Behavior:
The task continues to retry indefinitely, disregarding the retries parameter.

Operating System

Ubuntu 22.04 LTS

Versions of Apache Airflow Providers

apache-airflow-providers-google==7.0.0
apache-airflow-providers-amazon==3.0.0
apache-airflow-providers-slack==2.0.0

Deployment

Other

Deployment details

No response

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

  • I agree to follow this project's Code of Conduct
