Revert "Revert "fix(lookups): new docs on lookups""

zuluecho9 committed Jun 26, 2023
1 parent cc95b0d commit e32b54f

Showing 11 changed files with 269 additions and 24 deletions.
@@ -261,11 +261,12 @@ These capabilities pertain to [incident intelligence](/docs/alerts-applied-intel
id="new-relic-one"
title="New Relic One"
>
These are assorted capabilities related to basic features of the [New Relic platform](/docs/new-relic-solutions/new-relic-one/introduction-new-relic-one) (sometimes referred to as New Relic One):
* **Entities**: relates to creating and deleting New Relic-monitored [entities](/docs/new-relic-solutions/new-relic-one/core-concepts/what-entity-new-relic).
* **Entity relationships**: relates to [entity relationships](/docs/new-relic-solutions/new-relic-one/core-concepts/what-entity-new-relic/#related-entities).
* **Golden metrics**: relates to [golden metrics](/docs/apis/nerdgraph/examples/golden-metrics-entities-nerdgraph-api-tutorial) (key metrics) in curated user experiences.
* **Nerdpacks**: relates to [New Relic apps](/docs/new-relic-solutions/new-relic-one/build-custom-new-relic-one-application).
* **NRQL lookups**: relates to the ability to use [lookup tables](/docs/logs/ui-data/lookup-tables-ui).
* **Pixie account link**: this capability allows the creation of an associated Pixie account when adding Pixie to a cluster from our [guided install](/docs/kubernetes-pixie/auto-telemetry-pixie/install-auto-telemetry-pixie).
* **Pixie credentials**: relates to access of linked Pixie accounts.
* **Pixie live data**: enables access to live debugging data in the [Kubernetes cluster explorer](/docs/kubernetes-pixie/kubernetes-integration/understand-use-data/kubernetes-cluster-explorer).
@@ -257,29 +257,20 @@ The Event API accepts specific formats for attributes included in the payload. O
>
The following size and rate limits apply to events sent via the Event API:

* Payload total size: **1 MB (10^6 bytes) maximum per POST**. We highly recommend using compression.
* The payload must be encoded as **UTF-8**.
* Maximum number of attributes per event: 255
* Maximum length of attribute name: 255 characters
* Maximum length of attribute value: 4096 characters
* There are [rate limits on the number of HTTP requests per minute](#post-limit) sent to the Event API.

Some attributes have additional restrictions:

* `accountId`: This is a reserved attribute name. If it is included, it will be dropped during ingest.
* `entity.guid`, `entity.name`, and `entity.type`: These attributes are used internally to identify entities. Any values submitted with these keys in the attributes section of an event may cause undefined behavior such as missing entities in the UI or telemetry not associating with the expected entities. For more information, please refer to [Entity synthesis](/docs/new-relic-one/use-new-relic-one/core-concepts/what-entity-new-relic/#entity-synthesis).
* [`appId`](/docs/apis/rest-api-v2/requirements/find-product-id#apm): Value must be an integer. If it is not an integer, the attribute name and value will be dropped during ingest.
* `eventType`: This attribute can be a combination of alphanumeric characters, `_` underscores, and `:` colons.
* `timestamp`: This attribute must be a Unix epoch timestamp, defined in either seconds or milliseconds.
</Collapser>
</CollapserGroup>

64 changes: 64 additions & 0 deletions src/content/docs/logs/ui-data/lookup-tables-ui.mdx
@@ -0,0 +1,64 @@
---
title: "Upload CSV-format lookup tables"
metaDescription: 'In the New Relic logs UI, upload CSV-format lookup tables and use that data in combination with other New Relic data.'
---

import logsLookupTableUi from 'images/logs_screenshot-crop_lookup-table-ui.webp'

Our **lookups** feature lets you enrich your log data, and other New Relic-stored data, with data about your business that you define in a CSV file.

## Why use lookups? [#overview]

When you upload a lookup table, you can then use that data to enrich your queries of New Relic data. For example, you might upload a table that maps host IDs to human-readable host names, and then use that to create a chart that displays the human-readable host names.
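
As a minimal sketch of that host-name example (assuming a hypothetical table named `hostNames` with `hostId` and `hostName` columns, and telemetry events that carry a matching `hostId` attribute), a query might look like this:

```sql
FROM SystemSample
JOIN (FROM lookup(hostNames) SELECT hostId, hostName LIMIT 10000) ON hostId
SELECT average(cpuPercent) FACET hostName
```

Faceting on the joined `hostName` column is what swaps raw host IDs for readable names in the resulting chart.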

Lookup tables help you:

* Query data that isn't present in your New Relic account
* Make your telemetry data easier to understand
* Group data in custom ways

For examples of queries using lookups, see [How to query lookup table data](/docs/query-your-data/nrql-new-relic-query-language/nrql-query-tutorials/lookups).

## Add and manage tables [#add-table]

<img
title="Upload a csv lookup table"
alt="Upload a csv lookup table"
src={logsLookupTableUi}
/>

The UI for uploading lookup tables is located in the logs UI, but you can use your lookup table data when querying any data type, not just logs.

To find the lookup table UI: From [one.newrelic.com](https://one.newrelic.com), click **Logs**, and then click **Lookup tables**. From there, you can add, update, and delete tables.

Some tips when adding a table:

* The table name is what you'll use to reference that table when you write a [NRQL query](/docs/query-your-data/nrql-new-relic-query-language/nrql-query-tutorials/lookups).
* Table data isn't obfuscated, so avoid uploading sensitive information.
* It can take several minutes for an added or updated table to be available in a NRQL query.

When you delete a table, its data is no longer available in NRQL queries, so before deleting a table, make sure its data isn't being used in any important dashboards.

[Learn more about table format requirements and other details](#requirements).

## Query your data [#query]

If you're ready to use your table in a query, see [How to query lookup table data](/docs/query-your-data/nrql-new-relic-query-language/nrql-query-tutorials/lookups).

## Requirements and technical details [#requirements]

Some more details about lookup tables:

* Tables can only be queried from the [account](/docs/accounts/accounts-billing/account-structure/new-relic-account-structure/#organization-accounts) in which the table was uploaded.
* Each [account](/docs/accounts/accounts-billing/account-structure/new-relic-account-structure/#organization-accounts) has a limit of 20 lookup tables.
* Lookup table data can't be used in NRQL alert conditions.

In addition to [the general requirements for a CSV file](https://www.rfc-editor.org/rfc/rfc4180), here are our requirements for uploading lookup table files:

* Each row must have the same number of columns as the header, and there must be at least two columns.
* Table names must conform to [the rules for event names](/docs/data-apis/ingest-apis/event-api/introduction-event-api/#limits).
* Do not use [reserved words](/docs/data-apis/custom-data/custom-events/data-requirements-limits-custom-event-data/#reserved-words) for table names or column header values.
* Max file size: 4 MB
* Max of 20,000 rows
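
As an illustration, a small file meeting these requirements might look like this (the column names echo the hypothetical `storeNames` table used in the query examples):

```csv
store_ID,store_name,description
1001,Downtown,Main street retail location
1002,Airport,Kiosk in terminal B
```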


@@ -146,6 +146,12 @@ As noted in our [basic NRQL syntax doc](/docs/query-your-data/nrql-new-relic-que
SELECT count(*) FROM Transaction, PageView SINCE 3 days ago
```
</Collapser>
<Collapser
id="from-lookups"
title="Query data from a lookup table"
>
See [`lookup()`](#lookup).
</Collapser>
</CollapserGroup>
</Collapser>
</CollapserGroup>
@@ -656,6 +662,7 @@ As noted in our [basic NRQL syntax doc](/docs/query-your-data/nrql-new-relic-que
</CollapserGroup>
</Collapser>


<Collapser
className="freq-link"
id="sel-offset"
@@ -2765,6 +2772,24 @@ Note: `aparse()` is case-insensitive.
</CollapserGroup>
</Collapser>

<Collapser
className="freq-link"
id="func-lookup"
title={<InlineCode>lookup(table)</InlineCode>}
>
If you've [uploaded a lookup table](/docs/logs/ui-data/lookup-tables-ui), you can use this function with a table name to access that table's data in a query. Here's an example query:

```sql
FROM Log
SELECT count(*)
WHERE hostname IN (FROM lookup(myHosts) SELECT uniques(myHost))
```

For more information, see [How to query lookup table data](/docs/query-your-data/nrql-new-relic-query-language/nrql-query-tutorials/lookups).

</Collapser>


<Collapser
className="freq-link"
id="func-lower"
@@ -0,0 +1,160 @@
---
title: How to query lookup table data
tags:
- Query your data
- 'NRQL: New Relic Query Language'
- NRQL query tutorials
metaDescription: In New Relic, how to query data added via CSV lookup tables.
redirects:
---

import nrqlExampleStatusCodes from 'images/nrql_screenshot-crop_example-status-codes.webp'

import nrqlLookupQueryTranslateItemIds from 'images/nrql_screenshot-crop_lookup-query-translate-item-ids.webp'

import nrqlLookupQueryGeoip from 'images/nrql_screenshot-crop_lookup-query-geoip.webp'

When you [upload CSV-format lookup tables](/docs/logs/ui-data/lookup-tables-ui), you can use the `lookup()` function to access that data in your NRQL queries.

## Why use lookup tables? [#why]

For details on why you'd use lookup tables and how to upload them, see [Lookup tables](/docs/logs/ui-data/lookup-tables-ui).

## Basic query syntax [#basic-syntax]

Let's say you've named your table `storeNames`. This query will select all data from that table:

```sql
FROM lookup(storeNames)
SELECT *
```

This query will select some specific attributes from that same table:

```sql
FROM lookup(storeNames)
SELECT store_ID, store_name, description
```

## Query examples [#query-with-data]

The primary benefit of lookup tables is that you can write queries that combine their data with your New Relic-stored telemetry data.

Here are some query examples:

<CollapserGroup>
<Collapser
id="avoid-hardcording"
title="Avoid hardcoding a long list of hosts"
>

This query avoids hardcoding a long list of hosts by querying host names contained in a lookup table:

```sql
FROM Log
SELECT count(*)
WHERE hostname IN (FROM lookup(myHosts) SELECT uniques(myHost))
```
</Collapser>

<Collapser
id="using-join"
title="Query using JOIN"
>

Using `JOIN` queries can make your data more understandable. For example, this query for a custom event type uses the `storeNames` table to show store names along with total sales. Also notice the `LIMIT 10000`: lookup tables support a higher `LIMIT` than other NRQL data types.

```sql
FROM StoreEvent
JOIN (FROM lookup(storeNames) SELECT store_ID AS storeId, store_name AS storeName LIMIT 10000) ON shopId = storeId
SELECT shopId, storeName, totalSales
```
</Collapser>
<Collapser
id="status-codes"
title="Translate status codes"
>
Here's a query that translates status codes to readable summaries of the status:

```sql
FROM Transaction
JOIN (FROM lookup(statusCodeTable) SELECT status_code, status_summary, status_definition)
ON http.statusCode=status_code
SELECT count(*) FACET status_summary
```

Here are some example results:

<img
title="Screenshot of query for lookups translating status codes"
alt="Screenshot of query for lookups translating status codes"
src={nrqlExampleStatusCodes}
/>
</Collapser>

<Collapser
id="item-ids"
title="Translate item IDs"
>

This query shows how several NRQL features can work together to get business information from queries of log data. The query extracts information about items from log messages, and uses `JOIN` to get user-friendly item names and to generate a table of item names and the number of items stored.

```sql
WITH aparse(message, 'POST to carts: * body: {"itemId":"*","unitPrice":*}%') AS (URL, ItemID, Price)
FROM Log
JOIN (FROM lookup(itemNames) SELECT ItemID, itemName) ON ItemID
SELECT count(*) FACET itemName
WHERE message LIKE 'POST to carts%'
SINCE 30 days ago
```

Here are some example results:

<img
title="Screenshot of query for lookups translating item IDs"
alt="Screenshot of query for lookups translating item IDs"
src={nrqlLookupQueryTranslateItemIds}
/>

</Collapser>

<Collapser
id="geoip"
title="Use geographic info to analyze locations"
>

This query combines lookups and [GeoIP](/docs/logs/ui-data/parsing#geo) to find which locations have the most unsuccessful statuses:

```sql
FROM Log
JOIN (FROM lookup(statusCodeTable) SELECT status_code, status_summary, status_definition)
ON CacheResponseStatus=status_code
SELECT count(*) WHERE ClientIP.countryName IS NOT NULL AND status_summary != 'Success'
FACET ClientIP.countryName, status_summary, CacheResponseStatus
SINCE 1 day ago LIMIT MAX
```

Here are some example results:

<img
title="Screenshot of query for GeoIP info"
alt="Screenshot of query for GeoIP info"
src={nrqlLookupQueryGeoip}
/>

</Collapser>
</CollapserGroup>

## Technical details about querying [#limitations]

The maximum number of results returned when using `LIMIT` is 20,000.

You can't use lookup data for NRQL alert conditions.

The following NRQL clauses are not supported with lookup queries:

* `TIMESERIES`
* `COMPARE WITH`
* `EXTRAPOLATE`

But these clauses can be used if the lookup query is contained in an inner query. For an example of this, see [this query](#item-ids).
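
Here's another sketch (reusing the hypothetical `statusCodeTable` from the examples above): because the `lookup()` sits inside the `JOIN` subquery, the outer query is free to use `TIMESERIES`:

```sql
FROM Transaction
JOIN (FROM lookup(statusCodeTable) SELECT status_code, status_summary) ON http.statusCode = status_code
SELECT count(*) FACET status_summary TIMESERIES 1 hour SINCE 1 day ago
```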
2 changes: 2 additions & 0 deletions src/nav/dashboards.yml
@@ -89,6 +89,8 @@ pages:
path: /docs/query-your-data/nrql-new-relic-query-language/nrql-query-tutorials/query-infrastructure-dimensional-metrics-nrql
- title: NRQL subquery joins
path: /docs/query-your-data/nrql-new-relic-query-language/nrql-query-tutorials/subquery-joins
- title: Query data from lookup tables
path: /docs/query-your-data/nrql-new-relic-query-language/nrql-query-tutorials/lookups
- title: Mobile NRQL examples
path: /docs/query-your-data/nrql-new-relic-query-language/nrql-query-tutorials/nrql-query-examples-mobile-monitoring
- title: App data NRQL examples
2 changes: 2 additions & 0 deletions src/nav/logs.yml
@@ -97,6 +97,8 @@ pages:
path: /docs/logs/ui-data/data-partitions
- title: Log obfuscation
path: /docs/logs/ui-data/obfuscation-ui
- title: Lookup tables
path: /docs/logs/ui-data/lookup-tables-ui
- title: Drop filter rules
path: /docs/logs/ui-data/drop-data-drop-filter-rules
- title: Find data in long logs (blobs)