Closed
Labels
area:core, kind:bug, needs-triage, provider:celery
Description
Apache Airflow version
2.10.2
If "Other Airflow 2 version" selected, which one?
No response
What happened?
I'm currently running an Airflow instance set up with the CeleryExecutor.
I have a task assigned to a "heavy" queue:

from airflow.decorators import task
from airflow.utils.trigger_rule import TriggerRule
@task.virtualenv(
    trigger_rule=TriggerRule.NONE_FAILED_MIN_ONE_SUCCESS,
    requirements=[
        "my_package==1.1.1"
    ],
    index_urls=["my_index_url"],
    venv_cache_path="/tmp",
    queue="heavy",
)
def my_task(arg1, arg2, params=None):
    ...

But for some reason the task ran on a worker configured to handle only the default queue.
-> IMPORTANT: the DAG run existed before queue="heavy" was added to the task, and the task instance's state was cleared in order to restart it.
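If it helps triage: the queue is recorded on each task instance row in the metadata DB, so a cleared instance may still carry the queue it was created with. A minimal diagnostic sketch (assuming direct ORM access to the metadata DB; my_dag and my_run_id are placeholders for the affected run):

from airflow.models import TaskInstance
from airflow.utils.session import create_session

with create_session() as session:
    ti = (
        session.query(TaskInstance)
        .filter(
            TaskInstance.dag_id == "my_dag",
            TaskInstance.task_id == "my_task",
            TaskInstance.run_id == "my_run_id",
        )
        .first()
    )
    # If this still prints "default" after the DAG file was updated,
    # the cleared task instance kept its pre-change queue.
    print(ti.queue)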
What you think should happen instead?
Tasks assigned to a specific queue should only run on workers subscribed to that queue.
How to reproduce
Not sure exactly, but see the IMPORTANT note above: the problem appeared after adding queue="heavy" to an existing task and clearing its state.
Operating System
Ubuntu 22
Versions of Apache Airflow Providers
No response
Deployment
Docker-Compose
Deployment details
CeleryExecutor
2 workers:
celery worker --concurrency ${DEFAULT_WORKER_CONCURRENCY}           # consumes the default queue only
celery worker -q heavy --concurrency ${HEAVY_WORKER_CONCURRENCY}    # consumes the "heavy" queue only
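For context, a worker started without -q consumes from the configured default queue. A minimal sketch to confirm what that default is (assuming the standard Airflow config API):

from airflow.configuration import conf

# Workers launched without -q subscribe to this queue
# (config key [operators] default_queue, normally "default").
print(conf.get("operators", "default_queue"))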
Anything else?
-> Same as the IMPORTANT note above: the DAG run predates the queue="heavy" change, and the task instance was cleared to re-run it.
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct