When I have a periodic task with the `expires` field set to a specific datetime, the task still gets scheduled to the queue after the defined expiry time, but Celery marks it as revoked when executing it.
I am trying to understand whether this is expected behaviour or an issue.
Is there a workaround to avoid scheduling expired tasks at all? I am guessing this can cause issues if I have many schedules that have expired.
Adding some information below about my implementation and logs.
Sample Schedule Creation

import json
import uuid

from django_celery_beat.models import CrontabSchedule, PeriodicTask

schedule, _ = CrontabSchedule.objects.get_or_create(
    minute=schedule_minute,
    hour=schedule_hour,
    day_of_week='*',
    day_of_month='*',
    month_of_year='*',
)
periodic_task = PeriodicTask.objects.create(
    crontab=schedule,
    name=f'{task_name}_{uuid.uuid4()}',
    task=f'{app_name}.task.{task_name}',
    args=json.dumps(arguments),
    start_time=schedule_start,
    expires=schedule_end,
    one_off=is_one_off_task,
)
Logs from beat
<TASK ID> has an expiration date in the past
We assume this is intended and so we have set the expiration date to 0 instead.
According to RabbitMQ's documentation:
"Setting the TTL to 0 causes messages to be expired upon reaching a queue unless they can be delivered to a consumer immediately."
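As I understand the log message, an absolute `expires` in the past is converted to a message TTL clamped to 0, which makes RabbitMQ expire the message unless a consumer is ready immediately. A minimal sketch of that clamping behaviour (my own illustration, not the actual beat code):

```python
from datetime import datetime, timedelta, timezone

def expiration_ttl_ms(expires, now=None):
    """Convert an absolute expiry datetime to a message TTL in ms.

    Past expiries are clamped to 0, matching the behaviour described
    in the beat log above. Hypothetical helper for illustration only.
    """
    now = now or datetime.now(timezone.utc)
    remaining_ms = (expires - now).total_seconds() * 1000
    return max(0, int(remaining_ms))
```

So a schedule whose `expires` lies an hour in the past is still published, just with a TTL of 0, rather than being skipped by beat.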
Logs from Celery
Task task.test_task[91a3e09d-40ab-493f-a649-7ec3104c0f77] received
Discarding revoked task: task.test_task[91a3e09d-40ab-493f-a649-7ec3104c0f77]
Version Info
Django 3.2.12
django-celery 3.1.17
django-celery-beat 2.7.0
django-celery-results 2.5.0