# Add `Elements::BEFORE_BULK_ELEMENT_SAVING_EVENT` & `Elements::AFTER_BULK_ELEMENT_SAVING_EVENT` events #14000
-
### The Problem

Currently, there's no great way to know when a bulk element saving operation is in progress. The existing ways of detecting one don't handle all cases.
### The Reason

There are a number of plugins, such as Blitz and SEOmatic, that need to perform lengthy operations after elements are updated, typically by pushing a queue job. Blitz regenerates cached pages; SEOmatic regenerates sitemaps. During bulk operations, however, this can cause repeated and redundant queue jobs to be pushed into the queue unnecessarily. Each plugin must therefore implement some mechanism for attempting to detect when such a bulk operation is in progress. For example, this is how Blitz does it:

Unfortunately, this means that each plugin must re-invent the wheel, and it also doesn't handle cases where a custom plugin or module is doing some kind of bulk element saving operation.

### The Solution

A potential solution would be adding the events `Elements::BEFORE_BULK_ELEMENT_SAVING_EVENT` and `Elements::AFTER_BULK_ELEMENT_SAVING_EVENT`. This would essentially just formalize the pattern that Blitz is using, and these events could then be triggered by Craft wherever it performs bulk element saving. Anyone listening for the events would then know a bulk operation was starting, and could postpone any work they would normally do until the event notifying that the operation is done has fired. Potentially this could also reduce the number of queue jobs needed for things like pruning revisions, updating search indexes, etc.

### The Bonus Points

Even better would be a somewhat more elaborate system that allows for batching of elements being updated. For example, on an active site with lots of content authors editing various things, a number of queue jobs are generated for updating search indexes, refreshing Blitz caches, pruning revisions, regenerating sitemap caches, and so on. These are all things that need to be done, but they would be more efficient if coalesced into batches, so that a cascade of often redundant or overlapping queue jobs isn't necessary. SEOmatic tries to solve this by not pushing too many queue jobs...
We can't check the queue jobs table directly, because that would only work for sites using the db-based queue. Instead, SEOmatic caches a bit of data so it knows that a queue job for a particular sitemap has already been pushed into the queue and hasn't completed yet: https://github.com/nystudio107/craft-seomatic/blob/develop-v4/src/models/SitemapTemplate.php#L153
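The deferral pattern the proposal would formalize can be sketched in plain PHP. The dispatcher, event names, and plugin class below are illustrative stand-ins (Craft would use Yii's event system, and the event constants would live on `craft\services\Elements`), not actual Craft or Blitz code:

```php
<?php
// Minimal stand-in dispatcher -- not Craft's actual event system.
class MiniDispatcher
{
    /** @var array<string, callable[]> */
    private array $listeners = [];

    public function on(string $event, callable $handler): void
    {
        $this->listeners[$event][] = $handler;
    }

    public function trigger(string $event): void
    {
        foreach ($this->listeners[$event] ?? [] as $handler) {
            $handler();
        }
    }
}

// Hypothetical caching plugin: defers its queue job while a bulk op runs.
class CachePlugin
{
    public bool $bulkInProgress = false;
    public int $jobsPushed = 0;

    public function onElementSaved(): void
    {
        if ($this->bulkInProgress) {
            return; // coalesce: wait for the "after" event instead
        }
        $this->jobsPushed++;
    }
}

$events = new MiniDispatcher();
$plugin = new CachePlugin();

$events->on('beforeBulkElementSaving', fn() => $plugin->bulkInProgress = true);
$events->on('afterBulkElementSaving', function () use ($plugin) {
    $plugin->bulkInProgress = false;
    $plugin->jobsPushed++; // a single job for the whole batch
});

// Simulate a bulk resave of 100 elements.
$events->trigger('beforeBulkElementSaving');
for ($i = 0; $i < 100; $i++) {
    $plugin->onElementSaved();
}
$events->trigger('afterBulkElementSaving');

echo $plugin->jobsPushed; // 1 job instead of 100
```

Without the two bulk events, the same resave would push one queue job per saved element, which is exactly the redundancy described above.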
Replies: 3 comments
-
I run into this ☝🏼 a lot. One possible implementation would be for Craft to provide some start/end structure, sort of analogous to a transaction, for wrapping bulk operations:
or
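The two elided examples presumably showed an explicit begin/end pair and a closure-based wrapper. Here is a hypothetical sketch of both shapes, with a tiny stub standing in for Craft's elements service (all method names are invented, not Craft's actual API):

```php
<?php
// Stub for the elements service; demonstrates a transaction-like
// begin/end structure for bulk operations. Hypothetical API only.
class ElementsStub
{
    public int $depth = 0;
    public int $afterEvents = 0;

    public function beginBulkOperation(): void
    {
        $this->depth++;
    }

    public function endBulkOperation(): void
    {
        if (--$this->depth === 0) {
            $this->afterEvents++; // fire the "after bulk" event once
        }
    }

    // Closure-based wrapper, analogous to a DB transaction.
    public function bulkOperation(callable $fn): void
    {
        $this->beginBulkOperation();
        try {
            $fn();
        } finally {
            $this->endBulkOperation();
        }
    }
}

$elements = new ElementsStub();

// Shape 1: explicit begin/end calls around the bulk work.
$elements->beginBulkOperation();
// ... save many elements here ...
$elements->endBulkOperation();

// Shape 2: the wrapper; nesting fires the event only once at the end.
$elements->bulkOperation(function () use ($elements) {
    $elements->bulkOperation(function () {
        // ... save many elements here ...
    });
});

echo $elements->afterEvents; // 2 (one per top-level operation)
```

Tracking a nesting depth, as a DB transaction does with savepoints, would let bulk operations compose: a plugin's bulk resave triggered from inside another bulk operation wouldn't fire a second round of "after" events.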
-
Took a stab at this here: #14032. Let me know if this looks like it would work for you!
-
Looks great... I left a comment on the PR.