I noticed (unless I missed it) that there's no way to prevent duplicate jobs from being queued. I was wondering whether this feature would be pulled in if I were to write it. Specifically, I would only be developing it for the ActiveRecord backend: I'm lazy and don't intend to act like I know the best way to implement it for the other backends.
I intend to make it a global setting, like Delayed::Worker.max_run_time, with a per-job override. For instance:
Delayed::Worker.allow_duplicates = true
Delayed::Job.enqueue(RefreshTwitterFeedJob.new(username), :allow_duplicates => false)
The reasoning is that duplicates would normally be allowed, but we wouldn't want to queue the same job when an identical job is already queued and hasn't started yet, since the end result would be the same (or close enough).
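The de-duplication check could be sketched like this. Note this is a minimal, self-contained illustration, not the actual Delayed::Job implementation: an in-memory `FakeQueue` class (hypothetical) stands in for the ActiveRecord backend, and job identity is taken to be the serialized handler, mirroring how Delayed::Job stores jobs as YAML in the `handler` column. A job counts as "not started" when its `locked_at` is nil.

```ruby
require 'yaml'

# Hypothetical job class matching the example in the issue.
RefreshTwitterFeedJob = Struct.new(:username) do
  def perform; end
end

# In-memory stand-in for the ActiveRecord-backed job table.
class FakeQueue
  attr_reader :jobs

  def initialize
    @jobs = [] # each entry: { handler: String, locked_at: Time or nil }
  end

  # Enqueue unless an identical job is already queued and not yet started.
  def enqueue(payload, allow_duplicates: true)
    handler = YAML.dump(payload)
    unless allow_duplicates
      pending = @jobs.any? { |j| j[:handler] == handler && j[:locked_at].nil? }
      return nil if pending # skip: same job is queued and hasn't started
    end
    job = { handler: handler, locked_at: nil }
    @jobs << job
    job
  end
end

q = FakeQueue.new
q.enqueue(RefreshTwitterFeedJob.new('alice'), allow_duplicates: false)
q.enqueue(RefreshTwitterFeedJob.new('alice'), allow_duplicates: false) # skipped
q.enqueue(RefreshTwitterFeedJob.new('bob'),   allow_duplicates: false)
puts q.jobs.size # => 2
```

In the real ActiveRecord backend this comparison would presumably become a query on the `handler` column scoped to `locked_at IS NULL`, but the pending-vs-started distinction shown here is the core of it.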