This repository has been archived by the owner on Sep 20, 2019. It is now read-only.
One thing I'm aware of with the Laravel queue is that, should a job fail, the queue doesn't stop running; it merely moves the job to the failed-jobs table and then moves on to the next job. Let's say the failing job is caused by the projector not being able to handle a particular event for whatever reason.
If the application has additional jobs in the queue that apply to the same source, and therefore build on top of the projection our failed job couldn't create or update, isn't there a problem in that the queue will continue processing and the projector will either update a projection that has missing data or, if the projection was never created, just produce a load more failed jobs?
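For context, a minimal sketch (assuming a standard Laravel app; the hook placement in `AppServiceProvider::boot` and what you do inside it are up to you) of how such failures can at least be detected centrally, so later jobs for the same stream could bail out early:

```php
<?php

// Sketch: register a global failed-job hook so a failing projector job is
// noticed before the worker moves on to the next job. What you record here
// (e.g. flagging the affected stream as broken) is application-specific.

use Illuminate\Queue\Events\JobFailed;
use Illuminate\Support\Facades\Log;
use Illuminate\Support\Facades\Queue;

Queue::failing(function (JobFailed $event) {
    // $event->job->payload() contains the serialized job, which you could
    // inspect to work out which event stream the failure belongs to.
    Log::error('Projector job failed', [
        'connection' => $event->connectionName,
        'exception'  => $event->exception->getMessage(),
    ]);
});
```

`Queue::failing()` is stock Laravel; it fires for every failed job regardless of which queue it came from.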
Hi, I have the same question. When using event sourcing, handling events in sequence is very important; if they get out of order, the aggregate root will be wrong, or can't be reconstituted. So I think when one event fails, the others shouldn't be handled. Actually, we can do this ourselves in the aggregate root, something like:
// a method in the aggregate root
public function prepareOrder(string $timestamp): OrderAggregateRoot
{
    if (! $this->picked) {
        throw CouldNotChangeStatus::notPickedYet();
    }

    if ($this->prepared) {
        throw CouldNotChangeStatus::alreadyPrepared();
    }

    $this->recordThat(new OrderPrepared($timestamp));

    return $this;
}
Just don't call recordThat() when the event shouldn't be recorded.
And I also have a related question about queue jobs. As above, the sequence of events is important, so should we only use one consumer to handle one queue? Is that right?
That's to say, I'm building an e-commerce application, and I don't want one customer's order to slow down other customers'. For example, a normal order goes through the events OrderCreated, OrderPaid, OrderNotified, OrderPrepared, OrderDelivered. If I only use one consumer (worker) to handle all these events, then with multiple orders my application can't process them in parallel.
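One common workaround (a sketch, not part of this package; HandleOrderEvent and $orderUuid are illustrative names) is to shard jobs by aggregate, so events for the same order stay in order on one queue while different orders run in parallel:

```php
<?php

// Hypothetical sketch: deterministically pick one of N queues from the
// aggregate uuid, so all events for one order always land on the same
// queue, while different orders spread across the shards.
$shard = crc32($orderUuid) % 4; // assume 4 shards, one worker each

dispatch(new HandleOrderEvent($orderUuid, $event))
    ->onQueue("orders-{$shard}");

// Then run exactly one worker per shard, e.g.:
//   php artisan queue:work --queue=orders-0
//   php artisan queue:work --queue=orders-1
//   ...
```

With a single worker per shard, ordering within each order is preserved, and throughput scales with the number of shards.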
In v1 of this package there was extra functionality that guaranteed a projector wouldn't receive newer events when processing an older event had failed. Because the logic was quite heavy and difficult to get right, I removed it in v2.
So currently this package doesn't handle failed jobs; you should take care of that yourself. If the order of events is important for your projector, you indeed can't just process the next job. The responsibility for handling these situations lies with your application.
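A minimal sketch of that "handle it yourself" approach (OrderProjection and OrderPaid are illustrative names, not package APIs): before applying an event, the projector checks that the projection it builds on exists, and fails loudly rather than silently writing partial data.

```php
<?php

// Sketch: if an earlier event for this order failed, its projection row
// won't exist, so throwing here sends this job to the failed-jobs table
// instead of building on top of missing data.
class OrderProjector
{
    public function onOrderPaid(OrderPaid $event, string $aggregateUuid): void
    {
        $projection = OrderProjection::where('uuid', $aggregateUuid)->first();

        if ($projection === null) {
            throw new \RuntimeException(
                "No projection found for order {$aggregateUuid}; an earlier event likely failed."
            );
        }

        $projection->update(['status' => 'paid']);
    }
}
```

The failed job then shows up alongside the original failure, which makes the broken stream easy to spot and replay once the root cause is fixed.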