I'm encountering a persistent NotRegistered error in my Airflow setup and have exhausted the standard debugging steps. Even a minimal test case fails, suggesting a fundamental issue with how my Celery worker is being configured by Airflow.
Minimal Test Case
To isolate the issue, I removed all other files from my dags folder, leaving only two simple files:
dags/simple_task.py

```python
import logging

from airflow.providers.celery.executors.celery_executor import app

log = logging.getLogger(__name__)


@app.task
def my_simple_test_task(message):
    """A minimal task that only logs a message."""
    log.info("SUCCESS! The simple task ran with message: %s", message)
```
Configuration and Debugging Steps
My airflow.cfg is configured to import this module:
airflow.cfg

```ini
[celery]
imports = simple_task
```
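For reference, the setting can be read back through Airflow's own config layer (a quick sanity check, run in the same environment the worker uses; conf.get is the standard accessor):

```python
# Does Airflow's config layer return the [celery] imports value from airflow.cfg?
from airflow.configuration import conf

print(conf.get("celery", "imports"))  # expected output: simple_task
```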
I have already tried the following steps multiple times:
- Hard-resetting services: completely stopping the airflow scheduler and airflow celery worker processes and restarting them.
- Clearing caches: deleting all `__pycache__` directories and .pyc files from my project.
- Verifying file locations: ensuring both simple_task.py and test_dag.py are directly inside the dags folder referenced in my config.
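On top of those steps, this is the kind of check I would expect to show the task registering, run from the dags folder in the worker's environment (a sketch; it assumes the same app import used in simple_task.py above):

```python
# Importing the module by hand should register the task on the executor's
# Celery app, and app.conf shows whether the imports setting reached Celery.
from airflow.providers.celery.executors.celery_executor import app

print("celery imports setting:", app.conf.imports)

import simple_task  # noqa: F401  (importing applies the @app.task decorator)

print("registered:", [name for name in app.tasks if "simple_task" in name])
```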
The Result
When I run the minimal_celery_test DAG, the trigger_the_simple_task task sends the job to the broker, but it immediately fails on the worker (I can see this in the Flower dashboard) with the following error:

```
NotRegistered('simple_task.my_simple_test_task')
```
When I check the Celery worker's startup logs, the [tasks] section only lists the default Airflow tasks; my_simple_test_task is missing, which confirms it's not being registered.
My Question:
Given that this minimal configuration appears correct, what could be causing the Airflow Celery worker to completely ignore the [celery] imports setting in airflow.cfg? Are there any other known issues, environmental factors, or configurations specific to Airflow 3 that could lead to this behavior?