Commit: Merge 17dc8a9 into 79902d4
mcollina committed Jul 3, 2021
2 parents 79902d4 + 17dc8a9 commit 05a3363
Showing 20 changed files with 935 additions and 52 deletions.
7 changes: 4 additions & 3 deletions README.md
@@ -75,9 +75,10 @@ format logs during development:

Due to Node's single-threaded event-loop, it's highly recommended that sending,
alert triggering, reformatting and all forms of log processing
be conducted in a separate process or thread.

In Pino terminology we call all log processors "transports", and recommend that the
transports be run in a worker thread using our `pino.transport` API.

For more details see our [Transports⇗](docs/transports.md) document.

93 changes: 93 additions & 0 deletions docs/api.md
@@ -23,6 +23,7 @@
* [logger.version](#version)
* [Statics](#statics)
* [pino.destination()](#pino-destination)
* [pino.transport()](#pino-transport)
* [pino.final()](#pino-final)
* [pino.multistream()](#pino-multistream)
* [pino.stdSerializers](#pino-stdserializers)
@@ -905,6 +906,98 @@ A `pino.destination` instance can also be used to reopen closed files
* See [Reopening log files](/docs/help.md#reopening)
* See [Asynchronous Logging ⇗](/docs/asynchronous.md)

<a id="pino-transport"></a>
### `pino.transport(options) => ThreadStream`

Create a stream that routes logs to a worker thread that
wraps around a [Pino Transport](/docs/transports.md).

```js
const pino = require('pino')
const transport = pino.transport({
  target: 'some-transport',
  options: { some: 'options for', the: 'transport' }
})
pino(transport)
```

Multiple transports may also be defined, and specific levels can be logged to each transport:

```js
const pino = require('pino')
const transports = pino.transport([
  {
    level: 'info',
    target: 'some-transport',
    options: { some: 'options for', the: 'transport' }
  },
  {
    level: 'trace',
    target: '#pino/file',
    options: { destination: '/path/to/store/logs' }
  }
])
pino(transports)
```


For more on transports, how they work, and how to create them see the [`Transports documentation`](/docs/transports.md).

* See [`Transports`](/docs/transports.md)
* See [`thread-stream`](https://github.com/mcollina/thread-stream)

#### Options

* `target`: The transport to pass logs through. This may be an installed module name, an absolute path or a built-in transport (see [Transport Builtins](#transport-builtins))
* `options`: An options object which is serialized (see [Structured Clone Algorithm](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Structured_clone_algorithm)), passed to the worker thread, parsed and then passed to the exported transport function.
* `worker`: [Worker thread](https://nodejs.org/api/worker_threads.html#worker_threads_new_worker_filename_options) configuration options. Additionally, the `worker` option supports `worker.autoEnd`. If this is set to `false` logs will not be flushed on process exit. It is then up to the developer to call `transport.end()` to flush logs.
* `targets`: May be specified instead of `target`, must be an array of transport configurations. Transport configurations include the aforementioned `options` and `target` options plus a `level` option which will send only logs above a specified level to a transport.

#### Transport Builtins

The `target` option may be an absolute path, a module name or builtin.

A transport builtin takes the form `#pino/<name>`.

The following transport builtins are supported:

##### `#pino/file`

The `#pino/file` builtin routes logs to a file (or file descriptor).

The `options.destination` property may be set to specify the desired file destination.

```js
const pino = require('pino')
const transport = pino.transport({
  target: '#pino/file',
  options: { destination: '/path/to/file' }
})
pino(transport)
```

The `options.destination` property may also be a number to represent a file descriptor. Typically this would be `1` to write to STDOUT or `2` to write to STDERR. If `options.destination` is not set, it defaults to `1` which means logs will be written to STDOUT.

The difference between using the `#pino/file` transport builtin and using `pino.destination` is that `pino.destination` runs in the main thread, whereas `#pino/file` sets up `pino.destination` in a worker thread.

##### `#pino/pretty`

The `#pino/pretty` builtin prettifies logs.


By default the `#pino/pretty` builtin logs to STDOUT.

The `options.destination` property may be set to log pretty logs to a file descriptor or file. The following would send the prettified logs to STDERR:

```js
const pino = require('pino')
const transport = pino.transport({
  target: '#pino/pretty',
  options: { destination: 2 }
})
pino(transport)
```

<a id="pino-final"></a>

### `pino.final(logger, [handler]) => Function | FinalLogger`
168 changes: 126 additions & 42 deletions docs/transports.md
@@ -1,79 +1,163 @@
# Transports

Pino transports can be used for both transmitting and transforming log output.

The way Pino generates logs:

1. Reduces the impact of logging on an application to the absolute minimum.
2. Gives greater flexibility in how logs are processed and stored.

It is recommended that any log transformation or transmission is performed either
in a separate thread or a separate process.

Prior to Pino v7 transports would ideally operate in a separate process - these are
now referred to as [Legacy Transports](#legacy-transports).

From Pino v7 and upwards transports can also operate inside a [Worker Thread][worker-thread],
and can be used or configured via the options object passed to `pino` on initialization.

[worker-thread]: https://nodejs.org/dist/latest-v14.x/docs/api/worker_threads.html

## v7+ Transports

A transport is a module that exports a default function which returns a writable stream:

```js
import { Writable } from 'stream'
export default (options) => {
  const myTransportStream = new Writable({
    write (chunk, enc, cb) {
      // apply a transform and send to stdout
      console.log(chunk.toString().toUpperCase())
      cb()
    }
  })
  return myTransportStream
}
```

Let's imagine the above defines our "transport" as the file `my-transport.mjs`
(ESM files are supported even if the project is written in CJS).

We would set up our transport by creating a transport stream with `pino.transport`
and passing it to the `pino` function:

```js
const pino = require('pino')
const transport = pino.transport({
  target: '/absolute/path/to/my-transport.mjs'
})
pino(transport)
```

The transport code will be executed in a separate worker thread. The main thread
will write logs to the worker thread, which will write them to the stream returned
from the function exported from the transport file/module.

The exported function can also be async. Imagine the following transport:

```js
import fs from 'fs'
import { once } from 'events'
export default async (options) => {
  const stream = fs.createWriteStream(options.destination)
  await once(stream, 'open')
  return stream
}
```

While initializing the stream we're able to use `await` to perform asynchronous
operations, in this case waiting for the write stream's `open` event.

Let's imagine the above was published to npm with the module name `some-file-transport`.

The `options.destination` value can be set when creating the transport stream with `pino.transport` like so:

```js
const pino = require('pino')
const transport = pino.transport({
  target: 'some-file-transport',
  options: { destination: '/dev/null' }
})
pino(transport)
```

Note here we've specified a module by package name rather than by relative path. The options object we provide
is serialized and injected into the transport worker thread, then passed to the module's exported function.
This means that the options object can only contain types that are supported by the
[Structured Clone Algorithm][sca], which is used to (de)serialize objects between threads.

What if we wanted to use both transports, but send only error logs to `some-file-transport`
while sending all logs to `my-transport.mjs`? We can use the `pino.transport` function's `targets` option:

```js
const pino = require('pino')
const transport = pino.transport({
  targets: [
    { target: '/absolute/path/to/my-transport.mjs' },
    { target: 'some-file-transport', options: { destination: '/dev/null' }, level: 'error' }
  ]
})
pino(transport)
```
For more details on `pino.transport` see the [API docs for `pino.transport`][pino-transport].

[pino-transport]: /docs/api.md#pino-transport
[sca]: https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Structured_clone_algorithm
## Legacy Transports

A legacy Pino "transport" is a supplementary tool which consumes Pino logs.

Consider the following example for creating a transport:

```js
const { pipeline, Writable } = require('stream')
const split = require('split2')

const myTransportStream = new Writable({
  write (chunk, enc, cb) {
    // apply a transform and send to stdout
    console.log(chunk.toString().toUpperCase())
    cb()
  }
})

pipeline(process.stdin, split(JSON.parse), myTransportStream)
```
The above defines our "transport" as the file `my-transport-process.js`.

Logs can now be consumed using shell piping:

```sh
node my-app-which-logs-stuff-to-stdout.js | node my-transport-process.js
```

Ideally, a transport should consume logs in a separate process to the application.
Using transports in the same process causes unnecessary load and slows down
Node's single threaded event loop.
## Known Transports

PRs to this document are welcome for any new transports!

### Pino v7+ Compatible

+ [pino-elasticsearch](#pino-elasticsearch)

### Legacy

+ [pino-applicationinsights](#pino-applicationinsights)
+ [pino-azuretable](#pino-azuretable)
+ [pino-cloudwatch](#pino-cloudwatch)
+ [pino-couch](#pino-couch)
+ [pino-datadog](#pino-datadog)
+ [pino-elasticsearch](#pino-elasticsearch)
+ [pino-gelf](#pino-gelf)
+ [pino-http-send](#pino-http-send)
+ [pino-kafka](#pino-kafka)
2 changes: 1 addition & 1 deletion example.js → examples/basic.js
@@ -1,6 +1,6 @@
'use strict'

const pino = require('..')()

pino.info('hello world')
pino.error('this is at error level')