[DOCS-7840] Add OP setup/bootstrap doc (#23094)
* add doc

* apply suggested changes

* use alert box for note
maycmlee committed May 14, 2024
1 parent 96e79fa commit 15384d3
Showing 3 changed files with 113 additions and 37 deletions.
55 changes: 30 additions & 25 deletions config/_default/menus/main.en.yaml
@@ -3850,131 +3850,136 @@ menu:
identifier: observability_pipelines
parent: log_management_heading
weight: 10000
- name: Setup
url: observability_pipelines/setup_opw/
parent: observability_pipelines
identifier: observability_pipelines_setup_opw
weight: 1
- name: Log Volume Control
url: observability_pipelines/log_volume_control/
parent: observability_pipelines
identifier: observability_pipelines_log_volume_control
weight: 1
weight: 2
- name: Splunk HTTP Event Collector
url: observability_pipelines/log_volume_control/splunk_hec/
parent: observability_pipelines_log_volume_control
identifier: observability_pipelines_log_volume_control_splunk_hec
weight: 1001
weight: 2001
- name: Splunk Forwarders (TCP)
url: observability_pipelines/log_volume_control/splunk_tcp/
parent: observability_pipelines_log_volume_control
identifier: observability_pipelines_log_volume_control_splunk_tcp
weight: 1002
weight: 2002
- name: Sumo Logic Hosted Collector
url: observability_pipelines/log_volume_control/sumo_logic_hosted_collector/
parent: observability_pipelines_log_volume_control
identifier: observability_pipelines_log_volume_control_sumo_logic_hosted_collector
weight: 1003
weight: 2003
- name: Dual Ship Logs
url: observability_pipelines/dual_ship_logs/
parent: observability_pipelines
identifier: observability_pipelines_dual_ship_logs
weight: 2
weight: 3
- name: Splunk HTTP Event Collector
url: observability_pipelines/dual_ship_logs/splunk_hec/
parent: observability_pipelines_dual_ship_logs
identifier: observability_pipelines_dual_ship_logs_splunk_hec
weight: 2001
weight: 3001
- name: Splunk Forwarders (TCP)
url: observability_pipelines/dual_ship_logs/splunk_tcp/
parent: observability_pipelines_dual_ship_logs
identifier: observability_pipelines_dual_ship_logs_splunk_tcp
weight: 2002
weight: 3002
- name: Sumo Logic Hosted Collector
url: observability_pipelines/dual_ship_logs/sumo_logic_hosted_collector/
parent: observability_pipelines_dual_ship_logs
identifier: observability_pipelines_dual_ship_logs_sumo_logic_hosted_collector
weight: 2003
weight: 3003
- name: Archive Logs
url: observability_pipelines/archive_logs/
parent: observability_pipelines
identifier: observability_pipelines_archive_logs
weight: 3
weight: 4
- name: Datadog Agent
url: observability_pipelines/archive_logs/datadog_agent/
parent: observability_pipelines_archive_logs
identifier: observability_pipelines_archive_logs_datadog_agent
weight: 3001
weight: 4001
- name: Splunk HTTP Event Collector
url: observability_pipelines/archive_logs/splunk_hec/
parent: observability_pipelines_archive_logs
identifier: observability_pipelines_archive_logs_splunk_hec
weight: 3002
weight: 4002
- name: Splunk Forwarders (TCP)
url: observability_pipelines/archive_logs/splunk_tcp/
parent: observability_pipelines_archive_logs
identifier: observability_pipelines_archive_logs_splunk_tcp
weight: 3003
weight: 4003
- name: Sumo Logic Hosted Collector
url: observability_pipelines/archive_logs/sumo_logic_hosted_collector/
parent: observability_pipelines_archive_logs
identifier: observability_pipelines_archive_logs_sumo_logic_hosted_collector
weight: 3004
weight: 4004
- name: Split Logs
url: observability_pipelines/split_logs/
parent: observability_pipelines
identifier: observability_pipelines_split_logs
weight: 4
weight: 5
- name: Datadog Agent
url: observability_pipelines/split_logs/datadog_agent/
parent: observability_pipelines_split_logs
identifier: observability_pipelines_split_logs_datadog_agent
weight: 4001
weight: 5001
- name: Splunk HTTP Event Collector
url: observability_pipelines/split_logs/splunk_hec/
parent: observability_pipelines_split_logs
identifier: observability_pipelines_split_logs_splunk_hec
weight: 4002
weight: 5002
- name: Splunk Forwarders (TCP)
url: observability_pipelines/split_logs/splunk_tcp/
parent: observability_pipelines_split_logs
identifier: observability_pipelines_split_logs_splunk_tcp
weight: 4003
weight: 5003
- name: Sumo Logic Hosted Collector
url: observability_pipelines/split_logs/sumo_logic_hosted_collector/
parent: observability_pipelines_split_logs
identifier: observability_pipelines_split_logs_sumo_logic_hosted_collector
weight: 4004
weight: 5004
- name: Sensitive Data Redaction
url: observability_pipelines/sensitive_data_redaction/
parent: observability_pipelines
identifier: observability_pipelines_sensitive_data_redaction
weight: 5
weight: 6
- name: Datadog Agent
url: observability_pipelines/sensitive_data_redaction/datadog_agent/
parent: observability_pipelines_sensitive_data_redaction
identifier: observability_pipelines_sensitive_data_redaction_datadog_agent
weight: 5001
weight: 6001
- name: Splunk HTTP Event Collector
url: observability_pipelines/sensitive_data_redaction/splunk_hec/
parent: observability_pipelines_sensitive_data_redaction
identifier: observability_pipelines_sensitive_data_redaction_splunk_hec
weight: 5002
weight: 6002
- name: Splunk Forwarders (TCP)
url: observability_pipelines/sensitive_data_redaction/splunk_tcp/
parent: observability_pipelines_sensitive_data_redaction
identifier: observability_pipelines_sensitive_data_redaction_splunk_tcp
weight: 5003
weight: 6003
- name: Sumo Logic Hosted Collector
url: observability_pipelines/sensitive_data_redaction/sumo_logic_hosted_collector/
parent: observability_pipelines_sensitive_data_redaction
identifier: observability_pipelines_sensitive_data_redaction_sumo_logic_hosted_collector
weight: 5004
weight: 6004
- name: Update Existing Pipelines
url: observability_pipelines/update_existing_pipelines/
parent: observability_pipelines
identifier: observability_pipelines_update_existing_pipelines
weight: 6
weight: 7
- name: Best Practices for Scaling Observability Pipelines
url: observability_pipelines/best_practices_for_scaling_observability_pipelines/
parent: observability_pipelines
identifier: observability_pipelines_best_practices_for_scaling_observability_pipelines
weight: 7
weight: 8
- name: Log Management
url: logs/
pre: log
26 changes: 14 additions & 12 deletions content/en/observability_pipelines/_index.md
@@ -58,22 +58,24 @@ The Datadog UI provides a control plane to manage your Observability Pipelines W

## Get started

1. Navigate to [Observability Pipelines][1].
1. [Set up bootstrap options for the Observability Pipelines Worker][1].
1. Navigate to [Observability Pipelines][2].
1. Select a use case:
- [Log volume control][2]
- [Dual ship logs][3]
- [Split logs][4]
- [Archive logs to Datadog Archives][5]
- [Sensitive data redaction][6]
- [Log volume control][3]
- [Dual ship logs][4]
- [Split logs][5]
- [Archive logs to Datadog Archives][6]
- [Sensitive data redaction][7]
1. Enable monitors.

## Further Reading

{{< partial name="whats-next/whats-next.html" >}}

[1]: https://app.datadoghq.com/observability-pipelines
[2]: /observability_pipelines/log_volume_control/
[3]: /observability_pipelines/dual_ship_logs/
[4]: /observability_pipelines/split_logs/
[5]: /observability_pipelines/archive_logs/
[6]: /observability_pipelines/sensitive_data_redaction/
[1]: /observability_pipelines/setup_opw/
[2]: https://app.datadoghq.com/observability-pipelines
[3]: /observability_pipelines/log_volume_control/
[4]: /observability_pipelines/dual_ship_logs/
[5]: /observability_pipelines/split_logs/
[6]: /observability_pipelines/archive_logs/
[7]: /observability_pipelines/sensitive_data_redaction/
69 changes: 69 additions & 0 deletions content/en/observability_pipelines/setup_opw.md
@@ -0,0 +1,69 @@
---
title: Setup
kind: documentation
disable_toc: false
further_reading:
- link: "/observability_pipelines/log_volume_control/"
tag: "Documentation"
text: "Log volume control with Observability Pipelines"
- link: "/observability_pipelines/dual_ship_logs/"
tag: "Documentation"
text: "Dual ship logs with Observability Pipelines"
- link: "/observability_pipelines/archive_logs/"
tag: "Documentation"
text: "Archive logs with Observability Pipelines"
- link: "/observability_pipelines/split_logs/"
tag: "Documentation"
text: "Split logs with Observability Pipelines"
- link: "/observability_pipelines/sensitive_data_redaction/"
tag: "Documentation"
    text: "Redact sensitive data with Observability Pipelines"
- link: "/observability_pipelines/update_existing_pipelines/"
tag: "Documentation"
text: "Update existing pipelines"
---

## Overview

<div class="alert alert-warning">All configuration file paths specified remotely need to be under <code>DD_OP_DATA_DIR/config</code>.</div>

Bootstrap the Observability Pipelines Worker within your infrastructure before you set up a pipeline. The bootstrap options described below are separate from the options in the pipeline's configuration file.

## Bootstrap options

To set bootstrap options, do one of the following:
- Use environment variables.
- Create a `bootstrap.yaml` file and start the Worker instance with `--bootstrap-config /path/to/bootstrap.yaml`.
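
For example, a minimal `bootstrap.yaml` can be written and passed to the Worker at startup. This is a sketch: the option names come from the list below, but all values, the file path, and the tag format are placeholders you must replace.

```shell
# Sketch: write a minimal bootstrap.yaml for the Worker.
# All values are placeholders -- substitute your own API key, pipeline ID, and tags.
cat > bootstrap.yaml <<'EOF'
api_key: <YOUR_DATADOG_API_KEY>
pipeline_id: <YOUR_PIPELINE_ID>
site: datadoghq.com
data_dir: /var/lib/observability-pipelines-worker
tags: ["env:prod", "team:platform"]
threads: 4
EOF

# Then start the Worker pointing at the file (command shown for illustration only):
# observability-pipelines-worker run --bootstrap-config ./bootstrap.yaml
echo "wrote $(wc -l < bootstrap.yaml) lines to bootstrap.yaml"
```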

`api_key`
: env var: `DD_API_KEY`
: Create a [Datadog API key][1] for this environment variable.

`pipeline_id`
: env var: `DD_OP_PIPELINE_ID`
: Create an [Observability Pipelines pipeline ID][2] for this environment variable.

`site`
: env var: `DD_SITE`
: Your Datadog site (optional, default: `datadoghq.com`).
: See [Getting Started with Sites][3] for more information.

`data_dir`
: env var: `DD_OP_DATA_DIR`
: The data directory (optional, default: `/var/lib/observability-pipelines-worker`). This is the file system directory that the Observability Pipelines Worker uses for local state.

`tags: []`
: env var: `DD_OP_TAGS`
: Tags reported with internal metrics. They can be used to filter Observability Pipelines instances for Remote Configuration deployments.

`threads`
: env var: `DD_OP_THREADS`
: The number of threads to use for processing (optional, default: the number of available cores).
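
Equivalently, the same options can be supplied as environment variables when launching the Worker. The variable names come from the option list above; the values, the comma-separated `DD_OP_TAGS` format, and the launch command are illustrative assumptions.

```shell
# The bootstrap options expressed as environment variables.
# Values are placeholders; replace them before use.
export DD_API_KEY="<YOUR_DATADOG_API_KEY>"
export DD_OP_PIPELINE_ID="<YOUR_PIPELINE_ID>"
export DD_SITE="datadoghq.com"                                  # optional; default shown
export DD_OP_DATA_DIR="/var/lib/observability-pipelines-worker" # optional; default shown
export DD_OP_TAGS="env:prod,team:platform"                      # tag format is an assumption
export DD_OP_THREADS="4"                                        # defaults to the available cores

# Then start the Worker (shown for illustration only):
# observability-pipelines-worker run
echo "DD_SITE=${DD_SITE}"
```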

## Further reading

{{< partial name="whats-next/whats-next.html" >}}

[1]: https://app.datadoghq.com/organization-settings/api-keys
[2]: https://app.datadoghq.com/observability-pipelines
[3]: /getting_started/site/
