Improve Telegram channel's performance with large alert notifications #48974
Replies: 5 comments 1 reply
-
We're having the same issue now - looking into whether we can truncate the message to a certain length in the template. At least some alerts will appear, just not all of them.
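A minimal sketch of that truncation idea as a Grafana notification template (the telegram.short name, the fields used, and the 120-character cap are assumptions for illustration, not anything from this thread):

{{ define "telegram.short" -}}
{{ range .Alerts }}
{{- /* printf's precision caps each summary at 120 characters */ -}}
{{ if eq .Status "firing" }}🔴{{ else }}🟢{{ end }} <b>{{ index .Labels "alertname" }}</b>: {{ printf "%.120s" (index .Annotations "summary") }}
{{ end }}
{{- end }}

Pointing the contact point's Message field at a template like this keeps each alert down to one capped line.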
-
Also with the same issue... I've created a second notification template that I called "short", with just the alert titles, to be able to flush the queue. Maybe something as simple as:

def generate_messages(alerts):
    # apply_template is a stand-in for rendering the notification template
    msg = apply_template(alerts)
    # Telegram caps a message at 4096 characters; a single alert that
    # still overflows is returned as-is rather than recursing forever
    if len(msg.encode()) <= 4096 or len(alerts) == 1:
        return [msg]
    # otherwise split the alert list in half and render each half
    mid = len(alerts) // 2  # integer division (len(alerts)/2 is a float in Python 3)
    return generate_messages(alerts[:mid]) + generate_messages(alerts[mid:])

to generate several Telegram messages instead of sending only one. Apologies for using Python as pseudo-code instead of golang :)
-
I also faced the same problem. As a palliative solution, I reduced the size of the message sent to Telegram so it contains only the alert's name and description. (Translation performed via Google Translate.)
-
Hello, as you may have heard, we are transitioning away from using Discussions for feature requests. Edit: this discussion was migrated to an issue - #83412
-
This still has some issues: messages aren't sent when a number of alerts fire at once, so they should be split into chunks (see the Go sketch after the template). It seems to work when using the default template, but it does not work for this one:
{{- /* Telegram message to use: {{ template "telegram.message2" . }} */ -}}
{{ define "__alerts_list" -}}
{{ range . }}
{{if ne (index .Labels "alertname") "" -}}
{{ if eq .Status "firing" }}🔴{{ else }}🟢{{ end }}
{{- if ne (index .Labels "severity") "" -}}
<u><b>P{{ index .Labels "severity" }}</b></u> {{ end -}}
<b>{{ index .Labels "alertname" }}</b> 🕙 {{ .StartsAt.Format "15:04:05 🗓️ 2006-01-02" }}{{ end -}}
{{ if ne (index .Annotations "summary") "" }}
{{ index .Annotations "summary" }}{{ end -}}
{{ if ne (index .Annotations "description") "" }}
{{ index .Annotations "description" }}{{ end -}}
{{- if gt (len .Annotations) 2 }}
<i>Annotations:</i>
{{ range .Annotations.SortedPairs -}}
{{- if and (ne .Name "description") (ne .Name "summary") -}}
- {{ .Name }}: {{ .Value }}
{{ end }}
{{- end -}}{{- /* range */ -}}
{{- end }}
{{ if len .Labels -}}
<i>Labels:</i>
{{- range .Labels.SortedPairs }}
- {{ .Name }}: {{ .Value }}
{{- end -}}{{- /* range */ -}}
{{- end }}
<i>Value:</i> <pre>{{ .ValueString }}</pre>
{{- if gt (len .GeneratorURL) 0 }}<a href="{{ .GeneratorURL }}">source</a> | {{ end }}
{{- if gt (len .SilenceURL) 0 }}<a href="{{ .SilenceURL }}">🔕 silence</a> | {{ end }}
{{- if gt (len .DashboardURL) 0 }}📁 <a href="{{ .DashboardURL }}">dashboard</a> | {{ end }}
{{- if gt (len .PanelURL) 0 }}<a href="{{ .PanelURL }}">panel</a> {{- end -}}
<pre>--------</pre>
{{- end -}} {{- /* range */ -}}
{{- end -}} {{- /* define __alerts_list */ -}}
{{ define "__telegram.title" -}}
{{ if ne (index .CommonLabels "severity") "" }} <u><b>P{{ index .CommonLabels "severity" }}</b></u> {{ end -}}
{{ if ne (index .CommonLabels "alertname") "" -}}
[{{ index .CommonLabels "alertname" }}]
{{- end -}}
{{- end -}}{{- /* define __telegram */ -}}
{{ define "telegram.message" }}
{{ if gt (len .Alerts.Firing) 0 }}
🚨 <b>ALARM</b> (#{{ .Alerts.Firing | len }})
{{- template "__alerts_list" .Alerts.Firing }}{{ end -}}
{{ if gt (len .Alerts.Resolved) 0 }}
✅ <b>RESOLVED</b> (#{{ .Alerts.Resolved | len }})
{{- template "__alerts_list" .Alerts.Resolved }}{{ end }}
<a href="{{ .ExternalURL }}">📲 Grafana</a>
{{- end -}}
{{- template "telegram.message" . -}} |
-
What happened?
We've configured a contact point on our Grafana instance serving alerts through both email and Telegram. We are currently under a heavy alert load and have noticed that notifications are flawlessly delivered through email, but the same is not true for Telegram...
What you expected to happen
We expected the notifications to be correctly delivered both over email and Telegram, instead of just through the former.
How to reproduce it
TL;DR: trigger a fairly large number of alerts (35 in our case) and Telegram will stop sending notifications.
We observed that Telegram stopped sending alerts when under pressure in terms of the number of alerts. In order to test things out we prepared a collection of 'synthetic' alerts we could trigger at will. We found that when all the alerts were triggering, Telegram wasn't capable of sending them. On the other hand, when we silenced most of them through the use of tags, we suddenly discovered Telegram was sending the alerts again!
Anything else we need to know?
Given the behaviour we've just explained, we decided to take a close look at Grafana's logs to try and shed some light on the issue. In order to do so we added an option to increase the log verbosity in the config file (i.e. /etc/grafana/grafana.ini).
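For reference, a sketch of the standard way to do that (assuming the stock [log] section):

[log]
level = debug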
After doing so we inspected our logs (over at /var/log/grafana/grafana.log) and found entries for the failed Telegram requests; the key bit is the error description they carry. Given we are invoking the sendMessage method of Telegram's API, we can check the error codes there. Given the description, we can be fairly sure of the likely cause of the problem: we are trying to fit way too many characters into the message body, which makes sense given we're trying to squeeze in a ton of alerts. When only a couple of alerts are triggering, notifications make it through and the corresponding entries show up in the log.
Another key behaviour is that Telegram's test notifications always seem to work... unless we make them too big! To check, we can just write a ton of bogus data in the Message field and we will trigger the same error...
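A quick way to reproduce that limit outside Grafana is to call the Bot API's sendMessage directly with an oversized text; this Go sketch (BOT_TOKEN and CHAT_ID are placeholders) should get the same kind of 400 response back:

package main

import (
	"fmt"
	"io"
	"net/http"
	"net/url"
	"strings"
)

func main() {
	// BOT_TOKEN and CHAT_ID are placeholders for a real bot token and chat id.
	api := "https://api.telegram.org/botBOT_TOKEN/sendMessage"
	resp, err := http.PostForm(api, url.Values{
		"chat_id": {"CHAT_ID"},
		// 5000 characters, comfortably over Telegram's 4096-character cap
		"text": {strings.Repeat("x", 5000)},
	})
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status, string(body)) // expect a 400 with a descriptive error
}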
So, all in all, we have seen how Telegram won't notify us of alerts when we have a fairly large number of them triggering...
We don't know if this is the desired behaviour when managing these notifications, but we were very thrown off by not being notified.
If this were something to alter, we believe a message could be sent per alert so as not to cross the message length threshold or, at least, to surpass it less often. This change in the way notifications are delivered could trigger Telegram's bot rate limits when sending messages; we could circumvent that with some client-side rate limiting should we face issues of that nature. A rough sketch of this idea follows.
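A rough Go sketch of that idea (sendAll and the one-second pacing are assumptions; Telegram's actual per-chat limits may differ):

package notifier

import "time"

// sendAll sends one message per alert through send, pacing deliveries
// so a burst of per-alert messages doesn't trip the bot's rate limits.
func sendAll(messages []string, send func(string) error) error {
	ticker := time.NewTicker(time.Second) // roughly one message per second per chat (assumed)
	defer ticker.Stop()
	for i, m := range messages {
		if err := send(m); err != nil {
			return err
		}
		if i < len(messages)-1 {
			<-ticker.C // wait before the next send
		}
	}
	return nil
}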
In any case, we're more than happy to hear your suggestions. Note that even though we flagged this as a bug, that might not be the case: feel free to change the issue's category as you deem appropriate.
Thanks for your time and sorry for the 'noise'! 😸
Environment