
Add a Task class specialised for Django #8491

Merged: 22 commits merged into celery:main on Jan 29, 2024

Conversation

@browniebroke (Contributor) commented Sep 8, 2023

Note: Before submitting this pull request, please review our contributing guidelines.

Description

Add a task class specialised for use with Django. It offers an API to call .delay() on tasks at the end of the database transaction.

  • Add task to the codebase
  • Add docs
  • Add tests

Rationale

A common pitfall when using Celery with Django is to call .delay() in the middle of a database transaction and have the task start executing before the transaction is committed. I think it's partly because doing the call properly is a bit cumbersome.
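
For illustration, a minimal sketch of what goes wrong, assuming hypothetical Invoice model and task names (not code from this PR):

```python
# Hypothetical sketch of the race: the broker delivers the message
# independently of the database commit, so the worker can run the task
# before the producer's transaction is committed.
from celery import shared_task

from myapp.models import Invoice


@shared_task
def generate_invoice_pdf(invoice_id):
    # If this runs before the producer commits, the row is not visible yet
    # and the lookup raises Invoice.DoesNotExist; if the transaction rolls
    # back, the row never appears at all.
    invoice = Invoice.objects.get(pk=invoice_id)
    invoice.render_pdf()
```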

This PR adds a specialised task that aims to offer a nicer API for this. I've added the initial code and some docs to explain how it would be used, but before going further, I'd like to know whether this would be a welcome addition or whether it's considered out of scope.
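
As a rough sketch of what "doing the call properly" means today, again with hypothetical names:

```python
# Hypothetical producer-side sketch: today each call site has to wire up
# django.db.transaction.on_commit by hand to avoid the race above.
from django.db import transaction

from myapp.models import Invoice
from myapp.tasks import generate_invoice_pdf


def issue_invoice(order):
    with transaction.atomic():
        invoice = Invoice.objects.create(order=order)

        # Racy: the worker may pick this up before the commit.
        # generate_invoice_pdf.delay(invoice.pk)

        # Safe, but cumbersome to repeat everywhere:
        transaction.on_commit(lambda: generate_invoice_pdf.delay(invoice.pk))
    return invoice
```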

@codecov (bot) commented Sep 8, 2023

Codecov Report

All modified and coverable lines are covered by tests ✅

Comparison is base (86895a9) 81.24% compared to head (30be3ec) 81.25%.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #8491      +/-   ##
==========================================
+ Coverage   81.24%   81.25%   +0.01%     
==========================================
  Files         148      149       +1     
  Lines       18542    18553      +11     
  Branches     3165     3166       +1     
==========================================
+ Hits        15064    15075      +11     
  Misses       3191     3191              
  Partials      287      287              
Flag Coverage Δ
unittests 81.23% <100.00%> (+0.01%) ⬆️


@auvipy auvipy self-requested a review September 9, 2023 06:57
@browniebroke browniebroke marked this pull request as ready for review September 9, 2023 14:12
@auvipy auvipy added this to the 5.3.x milestone Sep 17, 2023
@Nusnus Nusnus self-requested a review September 18, 2023 20:45
@auvipy auvipy self-requested a review September 19, 2023 08:06
@thedrow (Member) commented Sep 20, 2023

"A common pitfall using Celery with Django is to call .delay() in a middle of a database transaction and have the task start executing before the transaction is committed. I think it's partly due to the fact that doing the call properly is a bit cumbersome."
I'm not sure I'm following here.
If you delay a task in the middle of a database transaction that may or may not be desired.
A delayed task does not participate in the same transaction as the producer so why should we commit before delaying?

browniebroke and others added 2 commits September 22, 2023 16:34
@Nusnus (Member) commented Sep 25, 2023

@browniebroke hey there - thanks for this contribution, I'm sure it will be very useful!

Do you think it's possible to also refine the django example or add another one for the new feature?

I'm positive it will increase adoption and possibly awareness of the new feature.

@browniebroke (Contributor, Author)

> Do you think it's possible to also refine the django example or add another one for the new feature?

Absolutely! I actually saw the example at some point during implementation and made a mental note to look at it later, but I (obviously) forgot about it 😄

Thanks for bringing it up.

@Nusnus (Member) commented Sep 25, 2023

> Do you think it's possible to also refine the django example or add another one for the new feature?
>
> Absolutely! I actually saw the example at some point during implementation and made a mental note to look at it later, but I (obviously) forgot about it 😄
>
> Thanks for bringing it up.

Awesome!
Thank you 👑

@browniebroke (Contributor, Author) commented Sep 25, 2023

> Do you think it's possible to also refine the django example or add another one for the new feature?

I looked at the example project and it was actually not making much use of the delay() API, except in the README. I've updated it, but I'm thinking we might want to leave the existing example and write a more relevant one, with a DB transaction. I see that there is a model, so maybe I could make use of that?

@auvipy auvipy self-requested a review September 27, 2023 08:41
@auvipy (Member) commented Oct 17, 2023

> Do you think it's possible to also refine the django example or add another one for the new feature?
>
> I looked at the example project and it was actually not making much use of the delay() API, except in the README. I've updated it, but I'm thinking we might want to leave the existing example and write a more relevant one, with a DB transaction. I see that there is a model, so maybe I could make use of that?

Yes, you can go for it. Use that model and write a nicer example.
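
A rough sketch of what such an example could look like; the model and task names are placeholders, and delay_on_commit() is the API proposed in this PR (treat the exact signature as illustrative):

```python
# Illustrative only: Widget and rename_widget are placeholders for whatever
# the example project uses; delay_on_commit() is the method proposed in this
# PR for enqueueing only after the surrounding transaction commits.
from django.db import transaction

from demoapp.models import Widget
from demoapp.tasks import rename_widget


def create_and_rename(name, new_name):
    with transaction.atomic():
        widget = Widget.objects.create(name=name)
        # The message is only sent once the transaction commits, so the
        # worker is guaranteed to see the committed row.
        rename_widget.delay_on_commit(widget.pk, new_name)
    return widget
```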

@auvipy auvipy modified the milestones: 5.3.x, 5.4 Nov 8, 2023
@cclauss (Contributor) commented Jan 5, 2024

Nice!! Perhaps this needs revisiting now that Django v5.0.1 is shipping.

@auvipy (Member) left a comment
do you plan to add anything more to this PR?

@browniebroke (Contributor, Author)

No, I don't think so.

@auvipy (Member) commented Jan 5, 2024

I will wait for other teammates' opinions before merging.

@Nusnus (Member) commented Jan 5, 2024

> I will wait for other teammates' opinions before merging.

I'll check with @thedrow if he's available for another review. Will do myself as well.

@auvipy auvipy merged commit da1146a into celery:main Jan 29, 2024
81 of 82 checks passed
@browniebroke browniebroke deleted the django-task branch January 29, 2024 17:23
@auvipy (Member) commented Jan 30, 2024

> No, I don't think so.

This failed only after the merge; it was OK during the last run in the PR.

tox: 3.9-smoke
3.9-smoke: 39864 W commands[0]> pytest -xsv t/smoke --dist=loadscope --reruns 10 --reruns-delay 60 --rerun-except AssertionError -n auto -k test_tasks.py [tox/tox_env/api.py:427]
============================= test session starts ==============================
platform linux -- Python 3.9.18, pytest-7.4.4, pluggy-1.4.0 -- /home/runner/work/celery/celery/.tox/3.9-smoke/bin/python
cachedir: .tox/3.9-smoke/.pytest_cache
rootdir: /home/runner/work/celery/celery
configfile: pyproject.toml
plugins: order-1.2.0, cov-4.1.0, timeout-2.2.0, docker-tools-3.1.3, subtests-0.11.0, click-1.1.0, xdist-3.5.0, github-actions-annotate-failures-0.2.0, celery-1.0.0b1, rerunfailures-13.0
created: 2/2 workers
2 workers [18 items]

scheduling tests via LoadScopeScheduling

t/smoke/tests/test_tasks.py::test_replace::test_sanity[celery_setup_worker-celery_redis_broker-celery_redis_backend]
t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_redis_broker-celery_redis_backend-Method.SIGKILL-WorkerLostError]
[gw0] PASSED t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_redis_broker-celery_redis_backend-Method.SIGKILL-WorkerLostError]
t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_redis_broker-celery_redis_backend-Method.SYSTEM_EXIT-WorkerLostError]
[gw0] PASSED t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_redis_broker-celery_redis_backend-Method.SYSTEM_EXIT-WorkerLostError]
t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_redis_broker-celery_redis_backend-Method.DELAY_TIMEOUT-TimeLimitExceeded]
[gw0] PASSED t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_redis_broker-celery_redis_backend-Method.DELAY_TIMEOUT-TimeLimitExceeded]
t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_redis_broker-celery_redis_backend-Method.EXHAUST_MEMORY-WorkerLostError]
[gw1] PASSED t/smoke/tests/test_tasks.py::test_replace::test_sanity[celery_setup_worker-celery_redis_broker-celery_redis_backend]
t/smoke/tests/test_tasks.py::test_replace::test_sanity[celery_setup_worker-celery_rabbitmq_broker-celery_redis_backend]
[gw0] PASSED t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_redis_broker-celery_redis_backend-Method.EXHAUST_MEMORY-WorkerLostError]
t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_rabbitmq_broker-celery_redis_backend-Method.SIGKILL-WorkerLostError]
[gw0] PASSED t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_rabbitmq_broker-celery_redis_backend-Method.SIGKILL-WorkerLostError]
t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_rabbitmq_broker-celery_redis_backend-Method.SYSTEM_EXIT-WorkerLostError]
[gw1] PASSED t/smoke/tests/test_tasks.py::test_replace::test_sanity[celery_setup_worker-celery_rabbitmq_broker-celery_redis_backend]
[gw0] PASSED t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_rabbitmq_broker-celery_redis_backend-Method.SYSTEM_EXIT-WorkerLostError]
t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_rabbitmq_broker-celery_redis_backend-Method.DELAY_TIMEOUT-TimeLimitExceeded]
[gw0] PASSED t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_rabbitmq_broker-celery_redis_backend-Method.DELAY_TIMEOUT-TimeLimitExceeded]
t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_rabbitmq_broker-celery_redis_backend-Method.EXHAUST_MEMORY-WorkerLostError]
[gw0] PASSED t/smoke/tests/test_tasks.py::test_task_termination::test_child_process_respawn[celery_setup_worker-celery_rabbitmq_broker-celery_redis_backend-Method.EXHAUST_MEMORY-WorkerLostError]
Error: The action has timed out.

@browniebroke (Contributor, Author) commented Jan 30, 2024

I saw the error and was wondering if it was some flakiness? Seems like the smoke tests worked on other Python versions... 🤔

Do we know where this comes from? Error: The action has timed out.

@Nusnus (Member) commented Jan 30, 2024

> I saw the error and was wondering if it was some flakiness? Seems like the smoke tests worked on other Python versions... 🤔
>
> Do we know where this comes from? Error: The action has timed out.

Check out the latest run. It was indeed a hiccup. I already restarted the CI yesterday :)
It passes 100%.

@Nusnus (Member) commented Jan 30, 2024

> I saw the error and was wondering if it was some flakiness? Seems like the smoke tests worked on other Python versions... 🤔
> Do we know where this comes from? Error: The action has timed out.
>
> Check out the latest run. It was indeed a hiccup. I already restarted the CI yesterday :) It passes 100%.

Hiccup, a.k.a. the Redis container hiccups, causing the Celery worker to hang -> CI reaches the timeout.

@faradayyg

Thanks for this improvement!
