
Commit

Merge remote-tracking branch 'origin' into translations-f7eabbe
roadlittledawn committed Feb 3, 2022
2 parents 40ce40b + 0a1173b commit 7648189
Showing 51 changed files with 2,041 additions and 1,479 deletions.
2 changes: 1 addition & 1 deletion src/content/docs/alerts-applied-intelligence/overview.mdx
@@ -73,7 +73,7 @@ Use the correlate features to define your data sources and to review and configu
* **[Sources](/docs/alerts-applied-intelligence/applied-intelligence/incident-intelligence/get-started-incident-intelligence/#1-configure-sources)**: Manage the data input sources you've chosen to analyze and be notified about. You can add new sources or configure existing ones.
* **[Decisions](/docs/alerts-applied-intelligence/applied-intelligence/incident-intelligence/change-applied-intelligence-correlation-logic-decisions/)**: Review your correlated incidents and make decisions on how to respond to them. You can also review, edit, and add decisions.

## **Enrich and notify** with useful metadata [#enrich]
## **Enrich** your **notifications** with useful metadata [#enrich]

Use the enrich and notify features to add additional metadata to your notifications, schedule when you don't want to be notified, and configure where your notifications get sent.


@@ -147,16 +147,6 @@ Dashboards
</td>
</tr>

<tr>
<td>
Data partitions
</td>

<td>
[Manage data partitions](/docs/apis/nerdgraph/examples/nerdgraph-data-partition-rules-tutorial)
</td>
</tr>

<tr>
<td>
Data management</td>
@@ -191,7 +181,6 @@ Entities
</td>
</tr>


<tr>
<td>
Key management
@@ -202,6 +191,17 @@ Key management
</td>
</tr>

<tr>
<td>
Log management
</td>

<td>
* [Manage data partitions](/docs/apis/nerdgraph/examples/nerdgraph-data-partition-rules-tutorial)
* [Manage log parsing rules](/docs/apis/nerdgraph/examples/nerdgraph-log-parsing-rules-tutorial/)
</td>
</tr>

<tr>
<td>
New Relic One apps </td>
@@ -259,8 +259,6 @@ Workloads
</td>
</tr>



</tbody>
</table>


@@ -323,12 +323,36 @@ To pass the `-javaagent` argument on Play:

## Resin [#Installing_on_Resin]

To pass the `-javaagent` argument on Resin, add it to the `<jvm-args>` section in your `resin.conf` or `resin.xml` file:
**Java 8**

To pass the `-javaagent` argument on Resin, add it to the `<jvm-args>` section in your `conf/resin.conf` or `conf/resin.xml` file:

```
<jvm-arg>-javaagent:<var>/full/path/to/</var>newrelic.jar</jvm-arg>
```

**Java 9 or higher**

The module system introduced in Java 9 can cause a `NoClassDefFoundError: java/sql/SQLException` exception if the `-javaagent` property is added to the `conf/resin.conf` or `conf/resin.xml` files. If you're using Java 9 or higher, make sure the `-javaagent` property is not included in those files.

To run Resin on Java 9 or higher with the Java agent, use one of the following options:

* Option 1 - Specify the `-javaagent` property on the command line:

```
java -javaagent:/path/to/newrelic.jar -jar lib/resin.jar start
```

* Option 2 - Add the `-javaagent` property to the `bin/resin.sh` script:

In `bin/resin.sh`, modify the line `exec $JAVA_EXE -jar lib/resin.jar $@` as follows:

```
exec $JAVA_EXE -javaagent:/path/to/newrelic/newrelic.jar -jar lib/resin.jar $@
```

The Resin server can then be run with `./bin/resin.sh start`.

## Solr [#Installing_on_solr]

To pass the `-javaagent` argument on Solr:

@@ -15,7 +15,7 @@ redirects:

When you report data to New Relic, we process what we receive and apply data dropping and transformation rules. Then we count the bytes needed to represent your data in a standard format, like JSON. If you're on our [New Relic One pricing model](/docs/accounts/accounts-billing/new-relic-one-pricing-users/pricing-billing), you're charged for the number of bytes written to our database that are above and beyond the free per-month amount.
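
As a rough, Python-based sketch of the byte-counting idea described above (a local approximation only, not New Relic's billing logic; the event fields are invented for illustration), you can serialize a sample event to JSON and count the bytes:

```
import json

# A hypothetical event payload, similar to what an agent might report.
event = {
    "eventType": "Transaction",
    "appName": "checkout-service",
    "duration": 0.042,
    "http.statusCode": 200,
    "timestamp": 1643846400000,
}

# Serialize to a standard format (JSON) and count the bytes written,
# mirroring the idea above: billing reflects the bytes stored, not the
# compressed payload the agent sends over the wire.
size_bytes = len(json.dumps(event, separators=(",", ":")).encode("utf-8"))

print(f"~{size_bytes} bytes per event")
print(f"~{size_bytes * 1_000_000 / 1e9:.3f} GB per million such events")
```

Multiplying a typical per-event size by your expected event volume gives a ballpark figure to compare against your monthly allotment.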

If you're trying to estimate the cost of your data ingest, see [Calculate data ingest](/docs/data-apis/manage-data/calculate-data-ingest).
If you're trying to estimate the cost of your data ingest, see [Estimate data ingest](/docs/data-apis/manage-data/calculate-data-ingest).

## Data ingestion UI [#ui]


@@ -46,7 +46,7 @@ Collecting and storing data in New Relic allows you to analyze, visualize, and a

Our ingest process helps you hone your data. For example, data might arrive at our processing front door compressed and of varying quality. Through ingest, that data is uncompressed, decorated with queryable attributes, and evaluated. Elements are dropped or trimmed, all before we write it to NRDB. That way, the data you store is only the data you want most.
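
For intuition, here is a toy Python walk-through of the stages this paragraph describes (decompress, decorate, apply drop rules). It is not New Relic's actual pipeline; the drop-rule attribute names and account ID are invented:

```
import gzip
import json

# Hypothetical drop rule: attributes we never want to store.
DROP_ATTRIBUTES = {"password", "creditCardNumber"}

def ingest(compressed_payload: bytes, account_id: str) -> dict:
    """Toy version of the stages described above."""
    # 1. Uncompress the payload as it arrived at the front door.
    event = json.loads(gzip.decompress(compressed_payload))

    # 2. Decorate the event with queryable attributes.
    event["accountId"] = account_id

    # 3. Evaluate drop rules before anything is written to storage.
    kept = {k: v for k, v in event.items() if k not in DROP_ATTRIBUTES}
    return kept  # only the kept data would count toward ingest

payload = gzip.compress(json.dumps(
    {"eventType": "Transaction", "duration": 0.042, "password": "hunter2"}
).encode("utf-8"))

print(ingest(payload, account_id="1234567"))
```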

Want to estimate your data ingest and cost? See [Calculate data ingest](/docs/data-apis/manage-data/calculate-data-ingest).
Want to estimate your data ingest and cost? See [Estimate data ingest](/docs/data-apis/manage-data/calculate-data-ingest).

### Performance management


@@ -12,7 +12,7 @@ redirects:
- /docs/understand-dependencies/distributed-tracing/enable-configure/enable-distributed-tracing
- /docs/new-relic-solutions/best-practices-guides/full-stack-observability/best-practices-enable-distributed-tracing
- /docs/understand-dependencies/distributed-tracing/enable-configure/language-agents-enable-distributed-tracing
---
---
import clogo from './images/clogo.png'

import gologo from './images/gologo.png'
@@ -159,7 +159,7 @@ For more help finding your traces in the UI:

## Set up Infinite Tracing (advanced option) [#infinite-tracing]

Standard distributed tracing for APM agents (above) captures up to 10% of your traces, but if you want us to analyze all your data and find the most relevant traces, you can set up Infinite Tracing. This alternative to standard distributed tracing is available for all APM language agents except C SDK.
Standard distributed tracing for APM agents (above) uses [adaptive sampling](/docs/distributed-tracing/concepts/how-new-relic-distributed-tracing-works/#trace-origin-sampling) to capture up to 10 traces per minute, but if you want us to analyze all your data and find the most relevant traces, you can set up Infinite Tracing. This alternative to standard distributed tracing is available for all APM language agents except C SDK.
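
For intuition only, the following is a toy Python sketch of throughput-based adaptive sampling with a target of 10 traces per minute. It is not the agent's actual algorithm; the class and numbers are invented for this illustration.

```
import random

class AdaptiveSampler:
    """Toy sampler aiming for ~`target_per_minute` sampled traces per minute.

    The sampling probability for the current minute is derived from the
    request volume observed during the previous minute.
    """

    def __init__(self, target_per_minute: int = 10):
        self.target = target_per_minute
        self.prev_minute_count = 0  # requests seen in the previous minute
        self.current_count = 0      # requests seen so far this minute

    def end_minute(self) -> None:
        # Roll the window: this minute's volume drives next minute's rate.
        self.prev_minute_count = self.current_count
        self.current_count = 0

    def should_sample(self) -> bool:
        self.current_count += 1
        if self.prev_minute_count == 0:
            # No history yet: sample the first `target` requests outright.
            return self.current_count <= self.target
        probability = min(1.0, self.target / self.prev_minute_count)
        return random.random() < probability

# Roughly 600 requests/minute should yield about 10 sampled traces.
sampler = AdaptiveSampler(target_per_minute=10)
sampler.prev_minute_count = 600
sampled = sum(sampler.should_sample() for _ in range(600))
print(f"sampled {sampled} of 600 requests")
```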

<Callout variant="tip">
To learn more about this feature, see [Infinite Tracing](/docs/understand-dependencies/distributed-tracing/infinite-tracing/introduction-infinite-tracing).
