Replies: 2 comments
-
This is happening because the default pool (
-
My problem
I'm trying to write some functional tests for a Django app that uses Celery. Those tests call Celery tasks that must be asynchronous and concurrent.
To be more flexible and run those tests on GitLab pipelines (sometimes we want to set `app.conf.task_always_eager = True` in our tests to make them synchronous), each test class starts its own Celery worker in its `setUpClass` method. However, it seems that the `concurrency=2` param is useless, as it doesn't allow for any kind of concurrency.
To reproduce
Install the latest Celery version (`5.2.3`).
In celery_tasks.py:
In celery_config.py:
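The config module also wasn't captured; a hypothetical celery_config.py, where every value is a placeholder rather than the poster's real configuration:

```python
# celery_config.py -- hypothetical settings; all values are placeholders
broker_url = "redis://localhost:6379/0"
result_backend = "redis://localhost:6379/0"
task_always_eager = False  # set to True to make tasks run synchronously
```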
Normal way to start a task (what is expected):
python -m celery -A celery_tasks worker
Now, in the terminal running Celery, you can see the logs. We clearly see that those tasks are asynchronous and concurrent.
We start our own worker:
Logs produced, in the same terminal where main.py was executed:
As you can see, even though the calls were asynchronous, they are not concurrent: both tasks execute one after the other, even with the `concurrency=2` param in `start_worker`.
Is this a bug, or is it intended? If it is intended, what's your suggestion to make this work? To be clear, I want those test classes to run on GitLab pipelines, and to have some of them behave synchronously with `app.conf.task_always_eager = True`.