
Added code to throw retry exception on any exception while executing sendNewAlerts #110

Closed
wants to merge 1 commit into from

Conversation

AltamashShaikh
Contributor

Added code to throw retry exception on any exception while executing sendNewAlerts
Fixes: #107

Description:

Please include a description of this change and which issue it fixes. If no issue exists yet please include context and what problem it solves.

Review

try {
    $this->notifier->sendNewAlerts($period, (int) $idSite);
} catch (\Exception $e) {
    if (class_exists('\Piwik\Scheduler\RetryableException')) {
        // rethrow as a retryable exception so the scheduler retries the task
        throw new \Piwik\Scheduler\RetryableException($e->getMessage());
    }
}
Contributor Author

@tsteur I'm not sure if this is the best place in the code to throw this error, as I was not able to decide which errors we should retry, so I targeted all errors arising while sending alerts.

Member

I'm not sure this would work. Effectively, if sending the alert fails, then we would process all alerts again and possibly send the same alert twice later.

At the same time, if processing an alert fails, then it would currently not be retried.

If I see this right, we might need to adjust e.g. shouldBeProcessed to check when the alert was last processed and sent, so we know whether we need to reprocess a task.
It would be great if you could look into this and see what happens in those cases.

The goal is that we retry the failed alert processing, or retry a failed alert send, while not double sending it.
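For illustration only, a rough sketch of the kind of check described above; getTsLastSentForAlert() is a hypothetical helper that would read ts_last_sent for this alert and site from the alert_triggered table, and the actual CustomAlerts code may be structured differently:

// Rough sketch only - not the plugin's current implementation.
private function shouldBeProcessed($alert, $period, $idSite)
{
    // hypothetical helper reading ts_last_sent from alert_triggered
    $tsLastSent = $this->getTsLastSentForAlert($alert['idalert'], $idSite, $period);

    if ($tsLastSent === null) {
        // never sent (or the previous send failed), so process / reprocess it
        return true;
    }

    // already processed and sent for this period, skip to avoid double sending
    return false;
}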

Contributor Author

@tsteur
I looked into possible solutions yesterday, and before that I went through how the code works. It is roughly like this:

Part 1 - ProcessAlerts
Get all the alerts for the period:
Select * from matomo_alert where period='day'
Add the same rows to the alert_triggered table with ts_last_sent=null

Part 2 - sendAlerts
Select * from alert_triggered where period='day' and ts_triggered between dates
Group alerts by email recipient:
    EmailRecipient1 => [alert1, alert2, ..., alertN],
    EmailRecipient2 => [alert1, alert2, ..., alertN],
Send the alerts

Group alerts by SMS recipient:
    SMSRecipient1 => [alert1, alert2, ..., alertN],
    SMSRecipient2 => [alert1, alert2, ..., alertN],
Send the alerts

Update the ts_last_sent column

We can add a try/catch around "Group alerts by email recipient" and "Group alerts by SMS recipient", store the idtriggered of each alert in an array when an exception occurs, and then not update ts_last_sent for any idtriggered that ended up in that array (see the sketch below).
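A minimal sketch of that idea, shown only for the email grouping (the SMS grouping would follow the same pattern); groupAlertsByEmailRecipient(), sendAlertsToRecipient() and markTriggeredAlertAsSent() are hypothetical names, not the plugin's actual API:

$failedIdTriggered = array();

foreach ($this->groupAlertsByEmailRecipient($triggeredAlerts) as $recipient => $alerts) {
    try {
        $this->sendAlertsToRecipient($recipient, $alerts);
    } catch (\Exception $e) {
        // remember every alert of the failed group so its ts_last_sent stays null
        foreach ($alerts as $alert) {
            $failedIdTriggered[] = $alert['idtriggered'];
        }
    }
}

// only mark the alerts that were actually sent
foreach ($triggeredAlerts as $alert) {
    if (!in_array($alert['idtriggered'], $failedIdTriggered)) {
        $this->markTriggeredAlertAsSent($alert['idtriggered']);
    }
}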

The problem here is that even though it might have run successfully for EmailRecipient1 but failed for EmailRecipient2, we would retry the entire alert again for both EmailRecipient1 and EmailRecipient2.

The major problem is that we want to group the alerts and send them together, but the retry works per ID, and even if we do not update ts_last_sent for the failed alerts, the alert will run again for the entire idsite.

I am still thinking about the best way to do this so that only the failed alerts are processed.
Could we change the database and add one more column, e.g. status = 'retry', that gets set when the task is re-triggered via the retry exception, plus one more WHERE clause while processing alerts so that only the rows marked for retry are picked? (Rough sketch below.)
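A rough sketch of how that could look, assuming the extra column is called status and using Matomo's Db / Common helpers; the column name, value and queries are placeholders, not a final schema:

use Piwik\Common;
use Piwik\Db;

// schema change (would normally live in a plugin Update/migration):
//   ALTER TABLE alert_triggered ADD COLUMN status VARCHAR(16) DEFAULT NULL;

// when sending a triggered alert fails, flag the row for retry
Db::query(
    'UPDATE ' . Common::prefixTable('alert_triggered') . ' SET status = ? WHERE idtriggered = ?',
    array('retry', $idTriggered)
);

// the retried task then only picks up rows explicitly marked for retry
$alertsToRetry = Db::fetchAll(
    'SELECT * FROM ' . Common::prefixTable('alert_triggered') . ' WHERE period = ? AND status = ?',
    array($period, 'retry')
);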

Member

Adding new columns is no problem 👍

@AltamashShaikh AltamashShaikh marked this pull request as ready for review March 7, 2022 02:00
@AltamashShaikh AltamashShaikh marked this pull request as draft March 7, 2022 02:01
@snake14
Contributor

snake14 commented May 4, 2023

@AltamashShaikh Should this be closed or merged?

@AltamashShaikh
Contributor Author

@snake14 let's close this, I wasn't able to really get it working.
