Update get-started-incident-intelligence.mdx
RoiBar1 committed Feb 27, 2022
1 parent 9810a63 commit 00dc5c2
Showing 1 changed file with 1 addition and 190 deletions.
You can get data from any of the following sources:
</Callout>

</Collapser>

<Collapser
className="freq-link"
id="configure-algorithmia"
title="Datarobot (formerly Algorithmia)"
>
By integrating Incident Intelligence with Datarobot, you can monitor the performance of your machine learning models. To configure Datarobot for Incident Intelligence, see our [integration docs](/docs/integrations/mlops-integrations/algorithmia-mlops-integration/).

</Collapser>

<Collapser
className="freq-link"

</Collapser>

<Collapser
className="freq-link"
id="configure-source-aws"
title="AWS"
>
You can integrate Incident Intelligence with Amazon CloudWatch to provide incident management for all of your AWS services.

To integrate Amazon CloudWatch:

1. Go to **[one.newrelic.com](https://one.newrelic.com)** and click **Alerts & AI**.
2. On the left under **Incident Intelligence**, click **Sources** and then click **Amazon Web Services**.
3. Copy the URL.
4. [Create a new Amazon SNS topic](https://docs.aws.amazon.com/sns/latest/dg/sns-getting-started.html).
5. Set CloudWatch to forward all alarm state changes to that topic:

* In the Amazon CloudWatch UI, click **Events > Event Pattern**.
* Select **Service Name > CloudWatch**.
* Select **Event Type > CloudWatch Alarm State Change**.
* Select **Targets > SNS Topic**, and select your new Amazon SNS topic.
6. Create a new subscription:

* In the Amazon AWS UI, click **Create a Subscription**.
* Select your new Amazon SNS topic.
* Select **Protocol > HTTPS**.
* In **Endpoint**, paste the URL you previously copied from the Applied Intelligence **Sources** page.
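
If you prefer to script this setup, the same flow can be sketched with the AWS CLI. This is only a sketch, not New Relic's official tooling: the topic name, rule name, and endpoint placeholder below are assumptions, and you should substitute the URL you copied from the Applied Intelligence **Sources** page.

```
# Minimal AWS CLI sketch (topic and rule names are placeholders).
# Create the SNS topic that will receive alarm state changes.
TOPIC_ARN=$(aws sns create-topic --name nr-incident-intelligence --query TopicArn --output text)

# Forward CloudWatch alarm state changes to the topic with an events rule.
aws events put-rule \
  --name nr-ii-alarm-state-changes \
  --event-pattern '{"source":["aws.cloudwatch"],"detail-type":["CloudWatch Alarm State Change"]}'
aws events put-targets \
  --rule nr-ii-alarm-state-changes \
  --targets "Id=nr-ii-sns,Arn=$TOPIC_ARN"
# Note: when created outside the console, the topic may also need a policy
# that allows events.amazonaws.com to publish to it.

# Subscribe the URL copied from the Sources page as an HTTPS endpoint.
aws sns subscribe \
  --topic-arn "$TOPIC_ARN" \
  --protocol https \
  --notification-endpoint "https://<URL-copied-from-Sources>"
```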
</Collapser>

<Collapser
className="freq-link"
id="configure-source-grafana"
title="Grafana"
>
You can integrate Incident Intelligence with Grafana's notifications for insight into events across your applications and environment. Grafana's webhook notification is a simple way to send information over HTTP to a custom endpoint.

To integrate Grafana as a new webhook:

1. Log into your Grafana portal using Admin permissions, and choose **Alerting**.
2. On the Grafana **Notification Channels** page, click **New Channel > Webhook**.
3. Go to **[one.newrelic.com](https://one.newrelic.com)** and click **Alerts & AI**.
4. On the left under **Incident Intelligence**, click **Sources**, and then click **Grafana**.
5. Copy the URL, and paste it into your new Grafana webhook.
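
If you manage Grafana as code, the same channel can be declared with Grafana's legacy notification-channel provisioning. This is only a sketch under assumptions: the file path, the `uid`, and the exact settings keys depend on your Grafana version, and the URL placeholder is the one copied from **Sources**.

```
# Sketch of a legacy (pre-unified alerting) notification-channel provisioning file.
# Path and keys vary by Grafana version; verify against your Grafana docs.
cat > /etc/grafana/provisioning/notifiers/newrelic-incident-intelligence.yaml <<'EOF'
notifiers:
  - name: New Relic Incident Intelligence
    type: webhook
    uid: newrelic-ii          # hypothetical uid
    org_id: 1
    settings:
      url: https://<URL-copied-from-Sources>
EOF
```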
</Collapser>

<Collapser
className="freq-link"
id="configure-source-pagerduty"
title="PagerDuty"
>
<Callout title="EOL NOTICE">
As of October 2021, we've discontinued support for several capabilities with PagerDuty, including suggested responders, golden signals, and component enrichment. For more details, including how you can easily make this transition, see our [Explorers Hub post](https://discuss.newrelic.com/t/upcoming-changes-to-capabilities-and-support-across-node-agents-suggested-pagerduty-responders-golden-signals-and-components-incident-workflows-mobile-agent-cross-application-tracing-cat-and-kubernetes-instrumentation/164481).
</Callout>

You can integrate Incident Intelligence directly with your PagerDuty services to ingest, process, and enhance all of your PagerDuty incidents. Connecting PagerDuty services to Applied Intelligence will not affect your current services or notifications.

To get data from PagerDuty:

1. Make sure your [PagerDuty API key](https://support.pagerduty.com/docs/generating-api-keys) has write access.
2. From **[one.newrelic.com](https://one.newrelic.com)**, click **Alerts & AI**.
3. On the left under **Incident Intelligence**, click **Sources** and then click **PagerDuty**.
4. Enter your [PagerDuty API key](https://support.pagerduty.com/docs/generating-api-keys).
* The key should be either a personal or general access key with write access. If it's created by a user, the user should be an admin.
5. Select the PagerDuty services you want to connect to Applied Intelligence, and click **Connect**.

You can add additional services or remove services you've already connected in **Sources > PagerDuty**.
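
To sanity-check the key before connecting, you can make a read-only call to PagerDuty's REST API. The snippet below is a sketch (the key value is a placeholder); a successful response only proves the key authenticates — write access still depends on the key's role.

```
# Read-only check that the PagerDuty API key authenticates (v2 REST API).
curl --silent \
  --header "Authorization: Token token=<YOUR_PAGERDUTY_API_KEY>" \
  --header "Accept: application/vnd.pagerduty+json;version=2" \
  "https://api.pagerduty.com/services?limit=5"
```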
</Collapser>

<Collapser
className="freq-link"
id="configure-source-prometheus"
title="Prometheus Alertmanager"
>
By integrating Incident Intelligence with Prometheus Alertmanager, you can receive and correlate your Prometheus alerts with events from other sources.

To integrate Prometheus Alertmanager:

1. Set up your Alertmanager configuration file, then start Alertmanager with it:

```
./alertmanager --config.file=simple.yml
```
2. Go to **[one.newrelic.com](https://one.newrelic.com)** and click **Alerts & AI**.
3. On the left under **Incident Intelligence**, click **Sources** and then click **Prometheus Alertmanager**.
4. Copy the Prometheus Alertmanager URL, and paste it into the `<webhook_config>/url` section of your Alertmanager config file.
5. Reload the Prometheus Alertmanager configuration using either of these methods:
* Send a `SIGHUP` to the process.
* Send an HTTP `POST` request to the `/-/reload` endpoint.
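
Putting steps 1 and 4 together, a minimal configuration that routes every alert to the copied URL looks roughly like the sketch below. The receiver name and file name are illustrative; only the `webhook_configs` URL matters.

```
# Minimal illustrative Alertmanager config: send all alerts to the
# Incident Intelligence URL copied from the Sources page.
cat > simple.yml <<'EOF'
route:
  receiver: newrelic-incident-intelligence

receivers:
  - name: newrelic-incident-intelligence
    webhook_configs:
      - url: https://<URL-copied-from-Sources>
EOF

# Start Alertmanager with the file; after later edits, reload it:
./alertmanager --config.file=simple.yml &
curl -X POST http://localhost:9093/-/reload
```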
</Collapser>
<Collapser
className="freq-link"
id="configure-source-rest-api"
For more information on authentication and the full API reference, see [REST API for New Relic Applied Intelligence](/docs/rest-api-new-relic-ai).
</Collapser>

<Collapser
className="freq-link"
id="configure-source-splunk"
title="Splunk"
>
By integrating Incident Intelligence with your Splunk log monitoring, you can:

* Use your environment's log data for searches and key term reports.
* Correlate alerts and search reports with your other metrics and incidents.

<Callout variant="important">
Applied Intelligence supports Splunk Light, Splunk Cloud, and Splunk Enterprise version 6.3 and higher.
</Callout>

To get data from Splunk:

1. In your **Splunk console**, start a search for the relevant events.
2. Save your search as an alert, configure your alert conditions, and then choose the webhook as the delivery method.
3. Go to **[one.newrelic.com](https://one.newrelic.com)** and click **Alerts & AI**.
4. On the left under **Incident Intelligence**, click **Sources** and then click **Splunk**.
5. Copy the collector URL, and paste it into the webhook endpoint in the Splunk console.
6. Optional: Use Splunk tokens to [enrich alert data with Splunk metadata](#enrich-splunk-search).
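
If you manage Splunk alerts as configuration instead of through the UI, the same webhook alert can be sketched as a `savedsearches.conf` stanza. Everything below is a placeholder except `action.webhook.param.url`, which takes the collector URL copied from **Sources**.

```
# Illustrative savedsearches.conf stanza (search, schedule, and app are placeholders).
cat >> "$SPLUNK_HOME/etc/apps/search/local/savedsearches.conf" <<'EOF'
[Web tier errors spiking]
search = index=web_logs status>=500 | stats count by host, app
enableSched = 1
cron_schedule = */5 * * * *
counttype = number of events
relation = greater than
quantity = 0
action.webhook = 1
action.webhook.param.url = https://<collector-URL-copied-from-Sources>
EOF
```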
</Collapser>

<Collapser
className="freq-link"
id="enrich-splunk-search"
title="Splunk metadata"
>
To enrich alert data with your Splunk metadata, use Splunk tokens. This lets you leverage your search data, which includes metadata and values from the first row of search results.

<table>
<thead>
<tr>
<th style={{ width: "200px" }}>
If you want to...
</th>

<th>
Do this...
</th>
</tr>
</thead>

<tbody>
<tr>
<td>
Access search data
</td>

<td>
Use the format `$<fieldname>$`. For example, use `$app$` for the app context for the search.
</td>
</tr>

<tr>
<td>
Access field values
</td>

<td>
To access field values from the first result row that a search returns, use the format `$result.<fieldname>$`. For example, use `$result.host$` for the host value and `$result.sourcetype$` for the source type.
</td>
</tr>

<tr>
<td>
Use variables
</td>

<td>
You can use any of the **Selected fields** in the Splunk search as variables. Add any unique fields to **Selected fields** to make their data available.

The following fields will automatically provide hints to the correlation engine:

* `app`: parsed as `APPLICATION_NAME`
* `application`: parsed as `APPLICATION_NAME`
* `application_name`: parsed as `APPLICATION_NAME`
* `cluster`: parsed as `CLUSTER_NAME`
* `computer`: parsed as `HOST_NAME`
* `Dc`: parsed as `DATACENTER_NAME`
* `datacenter`: parsed as `DATACENTER_NAME`
* `host`: parsed as `HOST_NAME`
* `host_name`: parsed as `HOST_NAME`
* `hostname`: parsed as `HOST_NAME`
* `transaction`: parsed as `EVENT_ID`
* `Transaction_id`: parsed as `EVENT_ID`
* `user`: parsed as `USER_NAME`
</td>
</tr>
</tbody>
</table>
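
For example, suppose the first result row returned by your saved search contains `host=web-01` and `sourcetype=access_combined`, and the search ran in the `search` app (all values here are hypothetical). The tokens would resolve roughly like this:

```
# Hypothetical first result row: host=web-01, sourcetype=access_combined
# $app$                -> search            (the app context of the search)
# $result.host$        -> web-01            (hinted to correlation as HOST_NAME)
# $result.sourcetype$  -> access_combined
```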
</Collapser>


</CollapserGroup>

## 3. Configure destinations [#2-configure-destinations]