32 changes: 13 additions & 19 deletions docs/tutorials/influxdb-alerting/add-alerting.md
@@ -1,22 +1,18 @@
# Add an alerting service using PagerDuty

To add a PagerDuty alerting service to your pipeline:
You'll now add a destination that provides alerting. The following screenshot illustrates what the new service looks like when added to your pipeline:

1. Sign up for a free PagerDuty account.

2. In PagerDuty, create a new service, using the Events v2 API as the integration. Make sure you obtain a routing key, also called the integration key, from the `Integrations` tab of the service.
![Pipeline](./images/alerting-pipeline.png)

3. In Quix, create a new destination on the output of your CPU Threshold service; the following screenshot illustrates what the completed pipeline looks like:
To add a PagerDuty alerting destination to your pipeline:

![Pipeline](./images/alerting-pipeline.png)
1. Sign up for a free PagerDuty account.

4. Edit `requirements.txt` to use Quix Streams v2:
2. In PagerDuty, create a new service, using the Events v2 API as the integration. Make sure you obtain a routing key, also called the integration key, from the `Integrations` tab of the service.

```
quixstreams==2.2.0a0
```
3. In Quix, create a new destination on the output of your CPU Threshold service. Choose the starter destination that uses Streaming Data Frames.

5. Edit `main.py` and replace the existing code with the following:
4. Edit `main.py` and replace the existing code with the following:

``` python
import os
@@ -27,8 +23,7 @@ To add a PagerDuty alerting service to your pipeline:
from typing import Dict
from typing import Optional
from quixstreams import Application
from quixstreams.models.serializers.quix import JSONDeserializer


def build_alert(title: str, alert_body: str, dedup: str) -> Dict[str, Any]:
routing_key = os.environ["PAGERDUTY_ROUTING_KEY"]
return {
@@ -61,13 +56,12 @@ To add a PagerDuty alerting service to your pipeline:

def pg_message(row):
alert_title = "CPU Threshold Alert"
alert_msg = "Average CPU load has reached " + str(row["value"]) + " %"
print("Sending PagerDuty alert")
send_alert(alert_title, alert_msg)
send_alert(alert_title, row["alert"]["message"])
return

app = Application.Quix("pagerduty-v1", auto_offset_reset="latest")
input_topic = app.topic(os.environ["input"], value_deserializer=JSONDeserializer())
input_topic = app.topic(os.environ["input"])

sdf = app.dataframe(input_topic)
sdf = sdf.update(pg_message)
@@ -78,16 +72,16 @@ To add a PagerDuty alerting service to your pipeline:

This code is based on the PagerDuty example code.
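For reference, here is a minimal, standard-library-only sketch of building and posting an Events v2 payload. The endpoint is PagerDuty's documented Events v2 enqueue address; the `source` and `severity` values and the `quix-cpu-monitor` name are illustrative assumptions, not part of the tutorial code:

``` python
import json
import os
import urllib.request

PAGERDUTY_EVENTS_URL = "https://events.pagerduty.com/v2/enqueue"


def build_alert(title: str, alert_body: str, dedup: str) -> dict:
    # The routing key comes from the service's Integrations tab in PagerDuty
    routing_key = os.environ["PAGERDUTY_ROUTING_KEY"]
    return {
        "routing_key": routing_key,
        "event_action": "trigger",
        "dedup_key": dedup,
        "payload": {
            "summary": title,
            "source": "quix-cpu-monitor",  # illustrative source name
            "severity": "critical",
            "custom_details": {"message": alert_body},
        },
    }


def send_alert(title: str, alert_body: str, dedup: str = "cpu-threshold") -> dict:
    # POST the event to PagerDuty's Events v2 enqueue endpoint
    data = json.dumps(build_alert(title, alert_body, dedup)).encode("utf-8")
    req = urllib.request.Request(
        PAGERDUTY_EVENTS_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Keeping payload construction (`build_alert`) separate from delivery (`send_alert`) makes the payload shape easy to verify without making a network call.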

6. Now create a new secret to store your routing key, `PAGERDUTY_ROUTING_KEY`. You'll also need to create a corresponding environment variable, `PAGERDUTY_ROUTING_KEY`:
5. Now create a new secret to store your routing key, `PAGERDUTY_ROUTING_KEY`. You'll also need to create a corresponding environment variable, `PAGERDUTY_ROUTING_KEY`:

![Environment variables](./images/alerting-pipeline-variables.png)

7. Now deploy your service.
6. Deploy the service.

When the alerting service receives a JSON message, it means the average CPU load has exceeded the specified threshold. The service creates an alert and sends it to PagerDuty, which then notifies your team of the incident. You can also check your activity in PagerDuty:

![PagerDuty activity](./images/pager-duty-activity.png)
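The sink's handling of an inbound message reduces to a small, testable function. The `{"alert": {"message": ...}}` shape below mirrors what `pg_message` in the code above reads from each row; treat it as a sketch of the expected message format:

``` python
import json


def parse_alert_message(raw: bytes) -> str:
    """Extract the alert text from an inbound JSON message.

    The {"alert": {"message": ...}} shape mirrors what the
    threshold-detection transform publishes earlier in the pipeline.
    """
    row = json.loads(raw)
    return row["alert"]["message"]
```

For example, `parse_alert_message(b'{"alert": {"message": "CPU high"}}')` returns the string passed to `send_alert`.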

## 🏃‍♀️ Next step

[Part 10 - Summary :material-arrow-right-circle:{ align=right }](./summary.md)
[Part 6 - Summary :material-arrow-right-circle:{ align=right }](./summary.md)
47 changes: 0 additions & 47 deletions docs/tutorials/influxdb-alerting/create-transform.md

This file was deleted.

17 changes: 0 additions & 17 deletions docs/tutorials/influxdb-alerting/data-explorer.md

This file was deleted.

50 changes: 0 additions & 50 deletions docs/tutorials/influxdb-alerting/downsampling.md

This file was deleted.

4 changes: 2 additions & 2 deletions docs/tutorials/influxdb-alerting/external-source.md
@@ -1,4 +1,4 @@
# Create an external source
# Add an external source

At this point you have an external program sending data into Quix, writing into a Quix topic. However, you can't see this program in the pipeline view. To help visualize what you've created, you can add an external source component to provide a visual entity in the pipeline view. To do this, log into Quix Cloud:

@@ -39,5 +39,5 @@ Note, this video is for a different project, but the principle is the same.

## 🏃‍♀️ Next step

[Part 3 - Develop a transform :material-arrow-right-circle:{ align=right }](./create-transform.md)
[Part 3 - Add InfluxDB destination :material-arrow-right-circle:{ align=right }](./influxdb-destination.md)

Binary file modified docs/tutorials/influxdb-alerting/images/alerting-pipeline.png
22 changes: 12 additions & 10 deletions docs/tutorials/influxdb-alerting/influxdb-destination.md
@@ -1,23 +1,25 @@
# Add an InfluxDB destination connector

You learned how to do this in the [InfluxDB Quickstart](../../integrations/databases/influxdb/quickstart.md). Make sure the input to the destination is the `cpu-load-transform` topic.
Now add an InfluxDB destination. You'll subscribe to data coming from the external source (in this case, CPU load data from your laptop) and write it directly to InfluxDB for persistence.

![InfluxDB query](./images/influxdb-query.png)
!!! tip

You learned how to do this in the [InfluxDB Quickstart](../../integrations/databases/influxdb/quickstart.md).

Configure the connector with your InfluxDB credentials. Deploy your connector.
Make sure the input to the destination is the `cpu-load` topic.

Your pipeline now looks like this:
Configure the connector with your InfluxDB credentials. Deploy your connector. Raw CPU load data is stored in InfluxDB.

![InfluxDB alerting pipeline](./images/influxdb-alerting-pipeline.png)
You can now log into your InfluxDB Cloud account and query your bucket for data.

You can now log into your InfluxDB Cloud account and query your bucket for data. The following screenshot shows the results for a typical query:
## Optional filtering

![InfluxDB query](./images/influxdb-query.png)
In this case you connected your InfluxDB destination (sink) directly to the External Source, so all inbound data is written to InfluxDB. In some cases you may prefer to filter the data before writing it. To do this, add a transform to the output of the External Source, add filtering code suitable for your use case, and then connect the InfluxDB destination to the output of your transform. See the next step for an [example of a filtering transform](./threshold-detection.md), should you need to filter data before writing it to InfluxDB.
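The filtering logic itself can be a plain predicate. In the following sketch, the `cpu_load` field name and the 50% threshold are assumptions for illustration, not part of the tutorial's schema:

``` python
def should_persist(row: dict, threshold: float = 50.0) -> bool:
    # Keep only readings at or above the threshold; drop everything else.
    # The "cpu_load" field name is an assumption for illustration.
    return row.get("cpu_load", 0.0) >= threshold


# In a Streaming DataFrame transform, this might be wired in as
# (hypothetical wiring):
#     sdf = sdf.filter(should_persist)
```

Keeping the predicate as a standalone function makes it trivial to unit test before deploying the transform.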

You have now concluded the first part of the pipeline, where you learned how to get data into Quix, transform it, and stream that data to InfluxDB. You saw that very little code and configuration was required, and you worked in Python.
## Optional reading back from InfluxDB

In the next part of the tutorial you build a pipeline with an InfluxDB source (this queries InfluxDB by polling for new data), add a threshold detection transform, and add an alerting service.
You could optionally add an InfluxDB source connector to your pipeline. You learned how to do this in the [InfluxDB Quickstart](../../integrations/databases/influxdb/quickstart.md). This would enable you to read data from your InfluxDB database and publish it to a topic of your choice. Once data is published to a topic, you can add any additional processing by connecting Python transforms to that topic. For a detailed example, see the [Predictive maintenance tutorial](../predictive-maintenance/overview.md).

## 🏃‍♀️ Next step

[Part 6 - Add InfluxDB source :material-arrow-right-circle:{ align=right }](./influxdb-source.md)
[Part 4 - Add threshold detection :material-arrow-right-circle:{ align=right }](./threshold-detection.md)
15 changes: 0 additions & 15 deletions docs/tutorials/influxdb-alerting/influxdb-source.md

This file was deleted.

22 changes: 8 additions & 14 deletions docs/tutorials/influxdb-alerting/overview.md
Original file line number Diff line number Diff line change
@@ -1,7 +1,9 @@
# Alerting with InfluxDB, Quix Streams and PagerDuty
# Event detection and alerting featuring InfluxDB and PagerDuty

In this tutorial you learn how to create a CPU overload alerting pipeline with Quix, Quix Streams, InfluxDB, and PagerDuty.

You gather CPU data from your laptop and store it directly in InfluxDB. You also add a real-time event detection transform that detects if your CPU load exceeds a threshold value and, if so, sends an alert to PagerDuty.

!!! note

This tutorial uses Quix Streams v2.
@@ -38,23 +40,15 @@ This tutorial is divided up into several parts, to make it a more manageable lea

1. [Write the Python client](./python-client.md) - you write a command-line program using Quix Streams to get CPU load data into your pipeline.

2. [Create an external source](./external-source.md) - you create an external source - this enables your command-line program to be visible in the pipeline.

3. [Develop a transform](./create-transform.md) - you write a transform to convert inbound JSON data to a Quix format to be compatible with our InfluxDB connector and Quix data explorer.

4. [Examine your data](./data-explorer.md) - you use Quix data explorer to examine the data produced by your transform.

5. [Add an InfluxDB destination](./influxdb-destination.md) - you add a Quix InfluxDB destination connector to your pipeline.

6. [Add an InfluxDB source](./influxdb-source.md) - you add a Quix InfluxDB source connector to your pipeline.
2. [Add an external source](./external-source.md) - you add an external source - this enables your command-line program to be visible in the pipeline.

7. [Add threshold detection](./threshold-detection.md) - you add a threshold detection transform.
3. [Add an InfluxDB destination](./influxdb-destination.md) - you add a Quix InfluxDB destination connector (sink) to your pipeline. CPU load data is stored directly in InfluxDB.

8. [Downsample your data](./downsampling.md) - you use an aggregation to downsample your data.
4. [Create a threshold detection transform](./threshold-detection.md) - you develop a threshold detection transform. This determines if a CPU load threshold has been exceeded, and if so publishes a message to the output topic.

9. [Add alerting](./add-alerting.md) - add alerting using PagerDuty.
5. [Create an alerting sink](./add-alerting.md) - you add a PagerDuty destination (sink) to the pipeline. When the sink receives a message, it sends an alert to PagerDuty.

10. [Summary](./summary.md) - conclusion and next steps.
6. [Summary](./summary.md) - conclusion and next steps.

## 🏃‍♀️ Next step
