Azure Operator Insights data ingestion samples

Code samples and guidance demonstrating how to ingest data into an Azure Operator Insights Data Product.

Azure Operator Insights offers a range of options for ingesting data into Data Products:

  • The Azure Operator Insights ingestion agent, which runs on-premises or on an Azure VM. The agent can consume data from a range of sources and upload it to an Azure Operator Insights Data Product. The agent currently supports ingestion by:
    • Pulling data from an SFTP server
    • Terminating a TCP stream of enhanced data records (EDRs) from the Affirmed MCC
  • Ingestion via other methods:
    • Existing Microsoft or third-party tools that upload data to Azure, e.g. AzCopy or Azure Data Factory
    • Custom code, e.g. using the Azure CLI or an Azure SDK

The Azure Operator Insights ingestion agent is the recommended ingestion method for the use cases it supports.

For other use cases, this repository provides samples of ingestion using common Microsoft tools or custom code.

Getting Started

Read this README, then explore the samples directory. When you want more in-depth guidance on implementing ingestion, read INGESTION-OVERVIEW.md.

Prerequisites

For most of the samples, you need:

  • An Azure Operator Insights Data Product
  • A Microsoft Entra identity used to run the sample. The identity must have the following roles:
    • Reader on your Data Product
    • Key Vault Secrets User on the managed key vault associated with your Data Product

Some samples have different prerequisites, which are listed in the code or README files for each sample (e.g. for Azure Databricks).

Quickstart

  1. `git clone https://github.com/Azure-Samples/operator-insights-data-ingestion.git`
  2. `cd operator-insights-data-ingestion/samples`
  3. Choose a sample, update the required parameters to work with your Data Product and data source, and run the sample.

Next steps

The samples in this repository demonstrate how to implement basic ingestion into a Data Product, and they are provided for prototyping and educational purposes only.

Before using these samples as part of a production system, consider what additional logging, security, performance, and resilience measures you need to add to make the code suitable for production use.

Samples

End-to-end samples

Most end-to-end samples follow three steps:

  1. Find the name of the managed key vault associated with the Data Product by querying the Data Product for its unique ID.
  2. Authenticate with the managed key vault and fetch the secret containing the Data Product's ingestion SAS URL.
  3. Upload data to the Data Product's ingestion endpoint, using the ingestion URL (or storage account name) and the ingestion SAS token.
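The three steps above can be sketched in Python with the Azure SDK. This is a minimal sketch, not one of the repository's samples: the secret name (`input-storage-sas`), the blob-name layout, and all resource names are assumptions — check your Data Product's documentation before relying on them.

```python
# Sketch of the end-to-end flow: fetch the ingestion SAS URL from the
# managed key vault, then upload a local file. Resource names and the
# secret name are placeholders/assumptions.

def ingestion_blob_name(data_type: str, file_name: str) -> str:
    # Assumed layout: blobs are uploaded under a prefix named after
    # the data type, e.g. "edr/example.csv".
    return f"{data_type}/{file_name}"

def upload_to_data_product(sas_url: str, data_type: str, local_path: str) -> None:
    # Imports deferred so the pure helper above works without the Azure SDK.
    from pathlib import Path
    from azure.storage.blob import ContainerClient
    # Step 3: upload using the ingestion SAS URL fetched in step 2.
    container = ContainerClient.from_container_url(sas_url)
    container.upload_blob(
        name=ingestion_blob_name(data_type, Path(local_path).name),
        data=Path(local_path).read_bytes(),
        overwrite=True,
    )

if __name__ == "__main__":
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient
    # Steps 1-2: the managed key vault's name is derived from the Data
    # Product's unique ID (found via the Azure CLI or portal).
    vault = SecretClient(
        "https://<managed-vault-name>.vault.azure.net",
        DefaultAzureCredential(),
    )
    sas_url = vault.get_secret("input-storage-sas").value  # assumed secret name
    upload_to_data_product(sas_url, "edr", "example.csv")
```

The upload logic is kept separate from credential handling so the same function works whether the SAS URL comes from the key vault, an environment variable, or a test fixture.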

Helper code snippets

Common parameters

Most samples require the same parameters:

  • resourceGroup: the resource group where the Data Product is deployed
  • dataProductName: the name of the Data Product to upload data to
  • dataType: the data type name for the data to upload, e.g. edr or pmstat. The valid data types for a Data Product are listed on the Data Management > Data types pane of the Azure portal for the Data Product.
  • sourceDataFilePath: the path to a local directory containing the files you want to upload to the Data Product
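A minimal argparse front-end mirroring these common parameters might look like the following; the flag names follow the list above, but everything else (descriptions, script shape) is an assumption, not code from the samples.

```python
# Hedged sketch: a command-line interface taking the common parameters
# most samples require. Flag names mirror the parameter list above.
import argparse

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="Upload local files to a Data Product's ingestion endpoint"
    )
    parser.add_argument("--resourceGroup", required=True,
                        help="resource group where the Data Product is deployed")
    parser.add_argument("--dataProductName", required=True,
                        help="name of the Data Product to upload data to")
    parser.add_argument("--dataType", required=True,
                        help="data type name, e.g. edr or pmstat")
    parser.add_argument("--sourceDataFilePath", required=True,
                        help="local directory containing the files to upload")
    return parser
```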

All samples must authenticate with the managed key vault that is associated with the Data Product. This authentication is handled differently depending on the sample.
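One common pattern for this authentication, sketched below under the assumption that `DefaultAzureCredential` is acceptable for your environment: it tries environment variables, a managed identity, and an existing `az login` session in turn, so one code path covers local runs and Azure-hosted runs. Vault and secret names here are placeholders.

```python
# Hedged sketch of key-vault authentication shared by the samples.

def managed_vault_url(vault_name: str) -> str:
    # Public-cloud endpoint; sovereign clouds use a different DNS suffix.
    return f"https://{vault_name}.vault.azure.net"

def fetch_ingestion_secret(vault_name: str, secret_name: str) -> str:
    # Imports deferred so the module loads without the Azure SDK installed.
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient
    client = SecretClient(managed_vault_url(vault_name), DefaultAzureCredential())
    return client.get_secret(secret_name).value
```

The identity used must hold the Key Vault Secrets User role on the managed key vault, as noted in the prerequisites.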
