Test of fastify core functionality examples!!!
1510
node_modules/pino/docs/api.md
generated
vendored
Normal file
File diff suppressed because it is too large
40
node_modules/pino/docs/asynchronous.md
generated
vendored
Normal file

# Asynchronous Logging

Asynchronous logging keeps Pino's overhead to a minimum.
Asynchronous logging works by buffering log messages and writing them in larger chunks.

```js
const pino = require('pino')
const logger = pino(pino.destination({
  dest: './my-file', // omit for stdout
  minLength: 4096, // Buffer before writing
  sync: false // Asynchronous logging
}))
```

It's always possible to turn on synchronous logging by passing `sync: true`.
In this mode of operation, log messages are directly written to the
output stream as the messages are generated with a _blocking_ operation.

* See [`pino.destination`](/docs/api.md#pino-destination)
* `pino.destination` is implemented on [`sonic-boom` ⇗](https://github.com/mcollina/sonic-boom).

### AWS Lambda

Asynchronous logging is disabled by default on AWS Lambda or any other environment
that modifies `process.stdout`. If forcefully turned on, we recommend calling `dest.flushSync()` at the end
of each function execution to avoid losing data, as sketched below.
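For example, here is a minimal sketch of a Lambda-style handler that forces asynchronous logging and flushes before returning; the handler shape and buffer size are illustrative:

```js
const pino = require('pino')

// Buffered, asynchronous destination (stdout, because `dest` is omitted)
const dest = pino.destination({ sync: false, minLength: 4096 })
const logger = pino(dest)

exports.handler = async (event) => {
  logger.info({ event }, 'invocation started')
  // ... perform the actual work here ...
  dest.flushSync() // write out anything still buffered before the runtime is frozen
  return { statusCode: 200 }
}
```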
## Caveats

Asynchronous logging has a couple of important caveats:

* As opposed to the synchronous mode, there is not a one-to-one relationship between
  calls to logging methods (e.g. `logger.info`) and writes to a log file
* There is a possibility of the most recently buffered log messages being lost
  in case of a system failure, e.g. a power cut (see the flush-on-shutdown sketch below).
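A flush-on-shutdown sketch, assuming the buffered destination shown above (this helps with orderly shutdowns, not with a power cut):

```js
const pino = require('pino')

const dest = pino.destination({ sync: false, minLength: 4096 })
const logger = pino(dest)

// 'exit' handlers must be synchronous; flushSync drains the buffer before exit.
process.on('exit', () => dest.flushSync())
// Route common termination signals through process.exit so the handler above runs.
process.on('SIGINT', () => process.exit(130))

logger.info('bye')
```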
See also:

* [`pino.destination` API](/docs/api.md#pino-destination)
* [`destination` parameter](/docs/api.md#destination)
55
node_modules/pino/docs/benchmarks.md
generated
vendored
Normal file

# Benchmarks

`pino.info('hello world')`:

```
BASIC benchmark averages
Bunyan average: 377.434ms
Winston average: 270.249ms
Bole average: 172.690ms
Debug average: 220.527ms
LogLevel average: 222.802ms
Pino average: 114.801ms
PinoMinLength average: 70.968ms
PinoNodeStream average: 159.192ms
```

`pino.info({'hello': 'world'})`:

```
OBJECT benchmark averages
BunyanObj average: 410.379ms
WinstonObj average: 273.120ms
BoleObj average: 185.069ms
LogLevelObject average: 433.425ms
PinoObj average: 119.315ms
PinoMinLengthObj average: 76.968ms
PinoNodeStreamObj average: 164.268ms
```

`pino.info(aBigDeeplyNestedObject)`:

```
DEEP-OBJECT benchmark averages
BunyanDeepObj average: 1.839ms
WinstonDeepObj average: 5.604ms
BoleDeepObj average: 3.422ms
LogLevelDeepObj average: 11.716ms
PinoDeepObj average: 2.256ms
PinoMinLengthDeepObj average: 2.240ms
PinoNodeStreamDeepObj average: 2.595ms
```

`pino.info('hello %s %j %d', 'world', {obj: true}, 4, {another: 'obj'})`:

For a fair comparison, [LogLevel](http://npm.im/loglevel) was extended
to include a timestamp and [bole](http://npm.im/bole) had
`fastTime` mode switched on.
242
node_modules/pino/docs/browser.md
generated
vendored
Normal file

# Browser API

Pino is compatible with [`browserify`](https://npm.im/browserify) for browser-side usage:

This can be useful with isomorphic/universal JavaScript code.

By default, in the browser,
`pino` uses corresponding [Log4j](https://en.wikipedia.org/wiki/Log4j) `console` methods (`console.error`, `console.warn`, `console.info`, `console.debug`, `console.trace`) and uses `console.error` for any `fatal` level logs.

## Options

Pino can be passed a `browser` object in the options object,
which can have the following properties:

### `asObject` (Boolean)

```js
const pino = require('pino')({browser: {asObject: true}})
```

The `asObject` option will create a pino-like log object instead of
passing all arguments to a console method, for instance:

```js
pino.info('hi') // creates and logs {msg: 'hi', level: 30, time: <ts>}
```

When `write` is set, `asObject` will always be `true`.

### `asObjectBindingsOnly` (Boolean)

```js
const pino = require('pino')({browser: {asObjectBindingsOnly: true}})
```

The `asObjectBindingsOnly` option is similar to `asObject` but keeps the message
and arguments unformatted. This defers formatting of the message to the actual
`console` method call, where browser devtools can apply richer formatting than
if Pino had already formatted the message into a string.

```js
pino.info('hello %s', 'world') // creates and logs {level: 30, time: <ts>}, 'hello %s', 'world'
```

### `formatters` (Object)

An object containing functions for formatting the shape of the log lines. When provided, it enables the logger to produce a pino-like log object with customized formatting. Currently, it supports formatting for the `level` object only.

##### `level`

Changes the shape of the log level. The default shape is `{ level: number }`.
The function takes two arguments, the label of the level (e.g. `'info'`)
and the numeric value (e.g. `30`).

```js
const formatters = {
  level (label, number) {
    return { level: number }
  }
}
```

### `write` (Function | Object)

Instead of passing log messages to `console.log` they can be passed to
a supplied function.

If `write` is set to a single function, all logging objects are passed
to this function.

```js
const pino = require('pino')({
  browser: {
    write: (o) => {
      // do something with o
    }
  }
})
```

If `write` is an object, it can have methods that correspond to the
levels. When a message is logged at a given level, the corresponding
method is called. If a method isn't present, the logging falls back
to using the `console`.

```js
const pino = require('pino')({
  browser: {
    write: {
      info: function (o) {
        // process info log object
      },
      error: function (o) {
        // process error log object
      }
    }
  }
})
```

### `serialize` (Boolean | Array)

The serializers provided to `pino` are ignored by default in the browser, including
the standard serializers provided with Pino. Since the default destination for log
messages is the console, values such as `Error` objects are enhanced for inspection,
which they otherwise wouldn't be if the Error serializer was enabled.

We can turn all serializers on,

```js
const pino = require('pino')({
  browser: {
    serialize: true
  }
})
```

Or we can selectively enable them via an array:

```js
const pino = require('pino')({
  serializers: {
    custom: myCustomSerializer,
    another: anotherSerializer
  },
  browser: {
    serialize: ['custom']
  }
})
// following will apply myCustomSerializer to the custom property,
// but will not apply anotherSerializer to another key
pino.info({custom: 'a', another: 'b'})
```

When `serialize` is `true` the standard error serializer is also enabled (see https://github.com/pinojs/pino/blob/master/docs/api.md#stdSerializers).
This is a global serializer, which will apply to any `Error` objects passed to the logger methods.

If `serialize` is an array, the standard error serializer is also automatically enabled. It can
be explicitly disabled by including the string `!stdSerializers.err` in the serialize array, like so:

```js
const pino = require('pino')({
  serializers: {
    custom: myCustomSerializer,
    another: anotherSerializer
  },
  browser: {
    serialize: ['!stdSerializers.err', 'custom'] // will not serialize Errors, will serialize `custom` keys
  }
})
```

The `serialize` array also applies to any child logger serializers (see https://github.com/pinojs/pino/blob/master/docs/api.md#discussion-2
for how to set child-bound serializers).

Unlike server-side Pino, the serializers apply to every object passed to the logger method;
if the `asObject` option is `true`, the serializers apply only to the
first object (as in server-side Pino).

For more info on serializers see https://github.com/pinojs/pino/blob/master/docs/api.md#mergingobject.

### `transmit` (Object)

An object with `send` and `level` properties.

The `transmit.level` property specifies the minimum level (inclusive) at which the `send` function
should be called. If not supplied, the `send` function will be called based on the main logging `level`
(set via `options.level`, defaulting to `info`).

The `transmit` object must have a `send` function which will be called after
writing the log message. The `send` function is passed the level of the log
message and a `logEvent` object.

The `logEvent` object is a data structure representing a log message. It represents
the arguments passed to a logger statement, the level
at which they were logged, and the hierarchy of child bindings.

The `logEvent` format is structured like so:

```js
{
  ts: Number,
  messages: Array,
  bindings: Array,
  level: { label: String, value: Number }
}
```

The `ts` property is a Unix epoch timestamp in milliseconds, taken at the moment the
logger method is called.

The `messages` array contains all arguments passed to the logger method (for instance `logger.info('a', 'b', 'c')`
would result in a `messages` array of `['a', 'b', 'c']`).

The `bindings` array represents each child logger (if any), and the relevant bindings.
For instance, given `logger.child({a: 1}).child({b: 2}).info({c: 3})`, the bindings array
would hold `[{a: 1}, {b: 2}]` and the `messages` array would be `[{c: 3}]`. The `bindings`
are ordered according to their position in the child logger hierarchy, with the lowest index
being the top of the hierarchy.

By default, serializers are not applied to log output in the browser, but they will *always* be
applied to `messages` and `bindings` in the `logEvent` object. This allows us to ensure a consistent
format for all values between server and client.

The `level` holds the label (for instance `info`), and the corresponding numerical value
(for instance `30`). This could be important in cases where client-side level values and
labels differ from server-side.

The point of the `send` function is to remotely record log messages:

```js
const pino = require('pino')({
  browser: {
    transmit: {
      level: 'warn',
      send: function (level, logEvent) {
        if (level === 'warn') {
          // maybe send the logEvent to a separate endpoint
          // or maybe analyze the messages further before sending
        }
        // we could also use the `logEvent.level.value` property to determine
        // the numerical value
        if (logEvent.level.value >= 50) { // covers error and fatal
          // send the logEvent somewhere
        }
      }
    }
  }
})
```
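As an illustration only (not part of Pino's API), a `send` function might ship the `logEvent` to a hypothetical `/logs` endpoint using standard browser APIs:

```js
// Illustrative sketch: '/logs' is a hypothetical collection endpoint.
const pino = require('pino')({
  browser: {
    transmit: {
      level: 'error',
      send: function (level, logEvent) {
        const body = JSON.stringify(logEvent)
        // sendBeacon survives page unloads; fall back to fetch otherwise
        if (navigator.sendBeacon) {
          navigator.sendBeacon('/logs', body)
        } else {
          fetch('/logs', { method: 'POST', body, keepalive: true })
        }
      }
    }
  }
})
```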
### `disabled` (Boolean)

```js
const pino = require('pino')({browser: {disabled: true}})
```

The `disabled` option will disable logging in the browser if set
to `true`. By default it is set to `false`.
40
node_modules/pino/docs/bundling.md
generated
vendored
Normal file

# Bundling

Due to its internal architecture based on Worker Threads, it is not possible to bundle Pino *without* generating additional files.

In particular, a bundler must ensure that the following files are also bundled separately:

* `lib/worker.js` from the `thread-stream` dependency
* `file.js`
* `lib/worker.js`
* Any transport used by the user (like `pino-pretty`)

Once the files above have been generated, the bundler must also add information about the files above by injecting code that sets `__bundlerPathsOverrides` in the `globalThis` object.

The variable is an object whose keys are identifiers for the files and whose values are the paths of the files relative to the currently bundled files.

Example:

```javascript
// Inject this using your bundle plugin
globalThis.__bundlerPathsOverrides = {
  'thread-stream-worker': pinoWebpackAbsolutePath('./thread-stream-worker.js'),
  'pino/file': pinoWebpackAbsolutePath('./pino-file.js'),
  'pino-worker': pinoWebpackAbsolutePath('./pino-worker.js'),
  'pino-pretty': pinoWebpackAbsolutePath('./pino-pretty.js'),
};
```

Note that `pino/file`, `pino-worker` and `thread-stream-worker` are required identifiers. Other identifiers are possible based on the user configuration.

## Webpack Plugin

If you are a Webpack user, you can achieve this with [pino-webpack-plugin](https://github.com/pinojs/pino-webpack-plugin) without manual configuration of `__bundlerPathsOverrides`; however, you still need to configure it manually if you are using other bundlers.

## Esbuild Plugin

[esbuild-plugin-pino](https://github.com/davipon/esbuild-plugin-pino) is the esbuild plugin to generate extra pino files for bundling.

## Bun Plugin

[bun-plugin-pino](https://github.com/vktrl/bun-plugin-pino) is the Bun plugin to generate extra pino files for bundling.
95
node_modules/pino/docs/child-loggers.md
generated
vendored
Normal file

# Child loggers

Let's assume we want to have `"module":"foo"` added to every log within a
module `foo.js`.

To accomplish this, simply use a child logger:

```js
'use strict'
// imports a pino logger instance of `require('pino')()`
const parentLogger = require('./lib/logger')
const log = parentLogger.child({module: 'foo'})

function doSomething () {
  log.info('doSomething invoked')
}

module.exports = {
  doSomething
}
```

## Cost of child logging

Child logger creation is fast:

```
benchBunyanCreation*10000: 564.514ms
benchBoleCreation*10000: 283.276ms
benchPinoCreation*10000: 258.745ms
benchPinoExtremeCreation*10000: 150.506ms
```

Logging through a child logger has little performance penalty:

```
benchBunyanChild*10000: 556.275ms
benchBoleChild*10000: 288.124ms
benchPinoChild*10000: 231.695ms
benchPinoExtremeChild*10000: 122.117ms
```

Logging via the child logger of a child logger also has negligible overhead:

```
benchBunyanChildChild*10000: 559.082ms
benchPinoChildChild*10000: 229.264ms
benchPinoExtremeChildChild*10000: 127.753ms
```

## Duplicate keys caveat

Naming conflicts can arise between child loggers and
children of child loggers.

This isn't as bad as it sounds: even if the same keys between
parent and child loggers are used, Pino resolves the conflict in the sanest way.

For example, consider the following:

```js
const pino = require('pino')
pino(pino.destination('./my-log'))
  .child({a: 'property'})
  .child({a: 'prop'})
  .info('howdy')
```

```sh
$ cat my-log
{"pid":95469,"hostname":"MacBook-Pro-3.home","level":30,"msg":"howdy","time":1459534114473,"a":"property","a":"prop"}
```

Notice how there are two keys named `a` in the JSON output. The sub-child's properties
appear after the parent's properties.

At some point, the logs will most likely be processed (for instance with a [transport](transports.md)),
and this generally involves parsing. `JSON.parse` will return an object where the conflicting
namespace holds the final value assigned to it:

```sh
$ cat my-log | node -e "process.stdin.once('data', (line) => console.log(JSON.stringify(JSON.parse(line))))"
{"pid":95469,"hostname":"MacBook-Pro-3.home","level":30,"msg":"howdy","time":"2016-04-01T18:08:34.473Z","a":"prop"}
```

Ultimately the conflict is resolved by taking the last value, which aligns with Bunyan's child logging
behavior.

There may be cases where this edge case becomes problematic if a JSON parser with alternative behavior
is used to process the logs. It's recommended to be conscious of namespace conflicts with child loggers,
in light of an expected log processing approach.

One of Pino's performance tricks is to avoid building objects and stringifying
them, so we're building strings instead. This is why duplicate keys between
parents and children will end up in the log output.
16
node_modules/pino/docs/diagnostics.md
generated
vendored
Normal file

# Diagnostics

Pino provides [tracing channel][tc] events that allow insight into the
internal workings of the library. The currently supported events are:

+ `tracing:pino_asJson:start`: emitted when the final serialization process
  of logs is started. The emitted event payload has the following fields:
  - `instance`: the Pino instance associated with the function
  - `arguments`: the arguments passed to the function
+ `tracing:pino_asJson:end`: emitted at the end of the final serialization
  process. The emitted event payload has the following fields:
  - `instance`: the Pino instance associated with the function
  - `arguments`: the arguments passed to the function
  - `result`: the finalized, newline delimited, log line as a string

[tc]: https://nodejs.org/docs/latest/api/diagnostics_channel.html#tracingchannel-channels
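For example, a consumer could observe these channels with Node's `diagnostics_channel` module; a minimal sketch that logs to `console.error` to avoid re-entering the logger:

```js
const dc = require('node:diagnostics_channel')
const pino = require('pino')

dc.subscribe('tracing:pino_asJson:start', ({ arguments: args }) => {
  // fired before a log line is serialized; args are the log call arguments
  console.error('serialization started with arguments:', args)
})

dc.subscribe('tracing:pino_asJson:end', ({ result }) => {
  // result is the finalized, newline delimited log line
  console.error('finalized log line:', result)
})

const logger = pino()
logger.info('hello world')
```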
85
node_modules/pino/docs/ecosystem.md
generated
vendored
Normal file

# Pino Ecosystem

This is a list of ecosystem modules that integrate with `pino`.

Modules listed under [Core](#core) are maintained by the Pino team. Modules
listed under [Community](#community) are maintained by independent community
members.

Please send a PR to add new modules!

<a id="core"></a>
## Core

### Frameworks
+ [`express-pino-logger`](https://github.com/pinojs/express-pino-logger): use
  Pino to log requests within [express](https://expressjs.com/).
+ [`koa-pino-logger`](https://github.com/pinojs/koa-pino-logger): use Pino to
  log requests within [Koa](https://koajs.com/).
+ [`restify-pino-logger`](https://github.com/pinojs/restify-pino-logger): use
  Pino to log requests within [restify](http://restify.com/).
+ [`rill-pino-logger`](https://github.com/pinojs/rill-pino-logger): use Pino as
  the logger for the [Rill framework](https://rill.site/).

### Utilities
+ [`pino-arborsculpture`](https://github.com/pinojs/pino-arborsculpture): change
  log levels at runtime.
+ [`pino-caller`](https://github.com/pinojs/pino-caller): add callsite to the log line.
+ [`pino-clf`](https://github.com/pinojs/pino-clf): reformat Pino logs into
  Common Log Format.
+ [`pino-console`](https://github.com/pinojs/pino-console): adapter for the [WHATWG Console](https://console.spec.whatwg.org/) spec.
+ [`pino-debug`](https://github.com/pinojs/pino-debug): use Pino to interpret
  [`debug`](https://npm.im/debug) logs.
+ [`pino-elasticsearch`](https://github.com/pinojs/pino-elasticsearch): send
  Pino logs to an Elasticsearch instance.
+ [`pino-eventhub`](https://github.com/pinojs/pino-eventhub): send Pino logs
  to an [Event Hub](https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-what-is-event-hubs).
+ [`pino-filter`](https://github.com/pinojs/pino-filter): filter Pino logs in
  the same fashion as the [`debug`](https://npm.im/debug) module.
+ [`pino-gelf`](https://github.com/pinojs/pino-gelf): reformat Pino logs into
  GELF format for Graylog.
+ [`pino-hapi`](https://github.com/pinojs/hapi-pino): use Pino as the logger
  for [Hapi](https://hapijs.com/).
+ [`pino-http`](https://github.com/pinojs/pino-http): easily use Pino to log
  requests with the core `http` module.
+ [`pino-http-print`](https://github.com/pinojs/pino-http-print): reformat Pino
  logs into traditional [HTTPD](https://httpd.apache.org/) style request logs.
+ [`pino-mongodb`](https://github.com/pinojs/pino-mongodb): store Pino logs
  in a MongoDB database.
+ [`pino-multi-stream`](https://github.com/pinojs/pino-multi-stream): send
  logs to multiple destination streams (slow!).
+ [`pino-noir`](https://github.com/pinojs/pino-noir): redact sensitive information
  in logs.
+ [`pino-pretty`](https://github.com/pinojs/pino-pretty): basic prettifier to
  make log lines human-readable.
+ [`pino-socket`](https://github.com/pinojs/pino-socket): send logs to TCP or UDP
  destinations.
+ [`pino-std-serializers`](https://github.com/pinojs/pino-std-serializers): the
  core object serializers used within Pino.
+ [`pino-syslog`](https://github.com/pinojs/pino-syslog): reformat Pino logs
  to standard syslog format.
+ [`pino-tee`](https://github.com/pinojs/pino-tee): pipe Pino logs into files
  based upon log levels.
+ [`pino-test`](https://github.com/pinojs/pino-test): a set of utilities for
  verifying logs generated by the Pino logger.
+ [`pino-toke`](https://github.com/pinojs/pino-toke): reformat Pino logs
  according to a given format string.


<a id="community"></a>
## Community

+ [`@google-cloud/pino-logging-gcp-config`](https://www.npmjs.com/package/@google-cloud/pino-logging-gcp-config): config helper and formatter to output [Google Cloud Platform Structured Logging](https://cloud.google.com/logging/docs/structured-logging).
+ [`@newrelic/pino-enricher`](https://github.com/newrelic/newrelic-node-log-extensions/blob/main/packages/pino-log-enricher): a log customization to add New Relic context to use [Logs In Context](https://docs.newrelic.com/docs/logs/logs-context/logs-in-context/).
+ [`cloud-pine`](https://github.com/metcoder95/cloud-pine): transport that provides abstraction and compatibility with [`@google-cloud/logging`](https://www.npmjs.com/package/@google-cloud/logging).
+ [`cls-proxify`](https://github.com/keenondrums/cls-proxify): integration of pino and [CLS](https://github.com/jeff-lewis/cls-hooked). Useful for creating dynamically configured child loggers (e.g. with added trace ID) for each request.
+ [`crawlee-pino`](https://github.com/imyelo/crawlee-pino): use Pino to log within Crawlee.
+ [`pino-colada`](https://github.com/lrlna/pino-colada): cute ndjson formatter for pino.
+ [`pino-dev`](https://github.com/dnjstrom/pino-dev): simple prettifier for pino with built-in support for common ecosystem packages.
+ [`pino-fluentd`](https://github.com/davidedantonio/pino-fluentd): send Pino logs to Elasticsearch,
  MongoDB, and many [others](https://www.fluentd.org/dataoutputs) via Fluentd.
+ [`pino-lambda`](https://github.com/FormidableLabs/pino-lambda): log transport for CloudWatch support inside AWS Lambda.
+ [`pino-pretty-min`](https://github.com/unjello/pino-pretty-min): a minimal
  prettifier inspired by the [logrus](https://github.com/sirupsen/logrus) logger.
+ [`pino-rotating-file`](https://github.com/homeaway/pino-rotating-file): a hapi-pino log transport for splitting logs into separate, automatically rotating files.
+ [`pino-tiny`](https://github.com/holmok/pino-tiny): a tiny (and extensible?) little log formatter for pino.
345
node_modules/pino/docs/help.md
generated
vendored
Normal file

# Help

* [Log rotation](#rotate)
* [Reopening log files](#reopening)
* [Saving to multiple files](#multiple)
* [Log filtering](#filter-logs)
* [Transports and systemd](#transport-systemd)
* [Log to different streams](#multi-stream)
* [Duplicate keys](#dupe-keys)
* [Log levels as labels instead of numbers](#level-string)
* [Pino with `debug`](#debug)
* [Unicode and Windows terminal](#windows)
* [Mapping Pino Log Levels to Google Cloud Logging (Stackdriver) Severity Levels](#stackdriver)
* [Using Grafana Loki to evaluate pino logs in a kubernetes cluster](#grafana-loki)
* [Avoid Message Conflict](#avoid-message-conflict)
* [Best performance for logging to `stdout`](#best-performance-for-stdout)
* [Testing](#testing)

<a id="rotate"></a>
## Log rotation

Use a separate tool for log rotation:
We recommend [logrotate](https://github.com/logrotate/logrotate).
Consider we output our logs to `/var/log/myapp.log` like so:

```
$ node server.js > /var/log/myapp.log
```

We would rotate our log files with logrotate, by adding the following to `/etc/logrotate.d/myapp`:

```
/var/log/myapp.log {
  su root
  daily
  rotate 7
  delaycompress
  compress
  notifempty
  missingok
  copytruncate
}
```

The `copytruncate` configuration has a very slight possibility of lost log lines due
to a gap between copying and truncating - the truncate may occur after additional lines
have been written. To perform log rotation without `copytruncate`, see the [Reopening log files](#reopening)
help.

<a id="reopening"></a>
## Reopening log files

In cases where a log rotation tool doesn't offer copy-truncate capabilities,
or where using them is deemed inappropriate, `pino.destination`
can reopen file paths after a file has been moved away.

One way to use this is to set up a `SIGUSR2` or `SIGHUP` signal handler that
reopens the log file destination, making sure to write the process PID out
somewhere so the log rotation tool knows where to send the signal.

```js
const pino = require('pino')

// write the process pid to a well known location for later
const fs = require('node:fs')
fs.writeFileSync('/var/run/myapp.pid', process.pid.toString())

const dest = pino.destination('/log/file')
const logger = pino(dest)
process.on('SIGHUP', () => dest.reopen())
```

The log rotation tool can then be configured to send this signal to the process
after a log rotation event has occurred.

Given a similar scenario as in the [Log rotation](#rotate) section a basic
`logrotate` config that aligns with this strategy would look similar to the following:

```
/var/log/myapp.log {
  su root
  daily
  rotate 7
  delaycompress
  compress
  notifempty
  missingok
  postrotate
    kill -HUP `cat /var/run/myapp.pid`
  endscript
}
```

<a id="multiple"></a>
## Saving to multiple files

See [`pino.multistream`](/docs/api.md#pino-multistream).

<a id="filter-logs"></a>
## Log Filtering
The Pino philosophy advocates common, preexisting, system utilities.

Some recommendations in line with this philosophy are:

1. Use [`grep`](https://linux.die.net/man/1/grep):
   ```sh
   $ # View all "INFO" level logs
   $ node app.js | grep '"level":30'
   ```
1. Use [`jq`](https://stedolan.github.io/jq/):
   ```sh
   $ # View all "ERROR" level logs
   $ node app.js | jq 'select(.level == 50)'
   ```

<a id="transport-systemd"></a>
## Transports and systemd
`systemd` makes it complicated to use pipes in services. One method for overcoming
this challenge is to use a subshell:

```
ExecStart=/bin/sh -c '/path/to/node app.js | pino-transport'
```

<a id="multi-stream"></a>
## Log to different streams

Pino's default log destination is the singular destination of `stdout`. While
not recommended for performance reasons, multiple destinations can be targeted
by using [`pino.multistream`](/docs/api.md#pino-multistream).

In this example, we use `stderr` for `error` level logs and `stdout` as default
for all other levels (e.g. `debug`, `info`, and `warn`).

```js
const pino = require('pino')
const streams = [
  {level: 'debug', stream: process.stdout},
  {level: 'error', stream: process.stderr},
  {level: 'fatal', stream: process.stderr}
]

const logger = pino({
  name: 'my-app',
  level: 'debug', // must be the lowest level of all streams
}, pino.multistream(streams))
```
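If each line should instead go only to the stream that best matches its level (so `error` logs skip `stdout`), `pino.multistream` also accepts a `dedupe` option; a minimal sketch, assuming pino v7+:

```js
const pino = require('pino')

const streams = [
  { level: 'debug', stream: process.stdout },
  { level: 'error', stream: process.stderr }
]

const logger = pino(
  { level: 'debug' },
  pino.multistream(streams, { dedupe: true }) // each line goes only to the highest matching stream
)

logger.info('goes to stdout only')
logger.error('goes to stderr only')
```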
<a id="dupe-keys"></a>
## How Pino handles duplicate keys

Duplicate keys are possible when a child logger logs an object with a key that
collides with a key in the child logger's bindings.

See the [child logger duplicate keys caveat](/docs/child-loggers.md#duplicate-keys-caveat)
for information on how this is handled.

<a id="level-string"></a>
## Log levels as labels instead of numbers
Pino log lines are meant to be parsable. Thus, Pino's default mode of operation
is to print the level value instead of the string name.
However, you can use the [`formatters`](/docs/api.md#formatters-object) option
with a [`level`](/docs/api.md#level) function to print the string name instead of the level value:

```js
const pino = require('pino')

const log = pino({
  formatters: {
    level: (label) => {
      return {
        level: label
      }
    }
  }
})

log.info('message')

// {"level":"info","time":1661632832200,"pid":18188,"hostname":"foo","msg":"message"}
```

Although it works, we recommend using one of these options instead if you are able:

1. If the only change desired is the name then a transport can be used. One such
   transport is [`pino-text-level-transport`](https://npm.im/pino-text-level-transport).
1. Use a prettifier like [`pino-pretty`](https://npm.im/pino-pretty) to make
   the logs human friendly.

<a id="debug"></a>
## Pino with `debug`

The popular [`debug`](https://npm.im/debug) module is used in many modules across the ecosystem.

The [`pino-debug`](https://github.com/pinojs/pino-debug) module
can capture calls to `debug` loggers and run them
through `pino` instead. This results in a 10x (20x in asynchronous mode)
performance improvement - even though `pino-debug` is logging additional
data and wrapping it in JSON.

To quickly enable this, install [`pino-debug`](https://github.com/pinojs/pino-debug)
and preload it with the `-r` flag, enabling any `debug` logs with the
`DEBUG` environment variable:

```sh
$ npm i pino-debug
$ DEBUG=* node -r pino-debug app.js
```

[`pino-debug`](https://github.com/pinojs/pino-debug) also offers fine-grained control to map specific `debug`
namespaces to `pino` log levels. See [`pino-debug`](https://github.com/pinojs/pino-debug)
for more.

<a id="windows"></a>
## Unicode and Windows terminal

Pino uses [sonic-boom](https://github.com/mcollina/sonic-boom) to speed
up logging. Internally, it uses [`fs.write`](https://nodejs.org/dist/latest-v10.x/docs/api/fs.html#fs_fs_write_fd_string_position_encoding_callback) to write log lines directly to a file
descriptor. On Windows, Unicode output is not handled properly in the
terminal (both `cmd.exe` and PowerShell), and as such the output could
be visualized incorrectly if the log lines include utf8 characters. It
is possible to configure the terminal to visualize those characters
correctly with the use of [`chcp`](https://ss64.com/nt/chcp.html) by
executing `chcp 65001` in the terminal. This is a known limitation of
Node.js.

<a id="stackdriver"></a>
## Mapping Pino Log Levels to Google Cloud Logging (Stackdriver) Severity Levels

Google Cloud Logging uses `severity` levels instead of log levels. As a result, all logs may show as INFO
level logs while completely ignoring the level set in the pino log. Google Cloud Logging also prefers that
log data is present inside a `message` key instead of the default `msg` key that Pino uses. Use a technique
similar to the one below to retain log levels in Google Cloud Logging:

```js
const pino = require('pino')

// https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry#logseverity
const PinoLevelToSeverityLookup = {
  trace: 'DEBUG',
  debug: 'DEBUG',
  info: 'INFO',
  warn: 'WARNING',
  error: 'ERROR',
  fatal: 'CRITICAL',
};

const defaultPinoConf = {
  messageKey: 'message',
  formatters: {
    level(label, number) {
      return {
        severity: PinoLevelToSeverityLookup[label] || PinoLevelToSeverityLookup['info'],
        level: number,
      }
    }
  },
}

module.exports = function createLogger(options) {
  return pino(Object.assign({}, options, defaultPinoConf))
}
```

A library that configures Pino for
[Google Cloud Structured Logging](https://cloud.google.com/logging/docs/structured-logging)
is available at:
[@google-cloud/pino-logging-gcp-config](https://www.npmjs.com/package/@google-cloud/pino-logging-gcp-config)

This library has the following features:

+ Converts Pino log levels to Google Cloud Logging log levels, as above
+ Uses `message` instead of `msg` for the message key, as above
+ Adds a millisecond-granularity timestamp in the
  [structure](https://cloud.google.com/logging/docs/agent/logging/configuration#timestamp-processing)
  recognised by Google Cloud Logging, e.g.: \
  `"timestamp":{"seconds":1445470140,"nanos":123000000}`
+ Adds a sequential
  [`insertId`](https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry#FIELDS.insert_id)
  to ensure log messages with identical timestamps are ordered correctly.
+ Logs including an `Error` object have the
  [`stack_trace`](https://cloud.google.com/error-reporting/docs/formatting-error-messages#log-error)
  property set so that the error is forwarded to Google Cloud Error Reporting.
+ Includes a
  [`ServiceContext`](https://cloud.google.com/error-reporting/reference/rest/v1beta1/ServiceContext)
  object in the logs for Google Cloud Error Reporting, auto-detected from the
  environment if not specified.
+ Maps the OpenTelemetry properties `span_id`, `trace_id`, and `trace_flags`
  to the equivalent Google Cloud Logging fields.

<a id="grafana-loki"></a>
## Using Grafana Loki to evaluate pino logs in a kubernetes cluster

To get pino logs into Grafana Loki there are two options:

1. **Push:** Use [pino-loki](https://github.com/Julien-R44/pino-loki) to send logs directly to Loki.
1. **Pull:** Configure Grafana Promtail to read and properly parse the logs before sending them to Loki.
   Similar to Google Cloud logging, this involves remapping the log levels. See this [article](https://medium.com/@janpaepke/structured-logging-in-the-grafana-monitoring-stack-8aff0a5af2f5) for details.

<a id="avoid-message-conflict"></a>
## Avoid Message Conflict

As described in the [`message` documentation](/docs/api.md#message), when a log
is written like `log.info({ msg: 'a message' }, 'another message')` then the
final output JSON will have `"msg":"another message"` and the `'a message'`
string will be lost. To overcome this, the [`logMethod` hook](/docs/api.md#logmethod)
can be used:

```js
'use strict'

const log = require('pino')({
  level: 'debug',
  hooks: {
    logMethod (inputArgs, method) {
      if (inputArgs.length === 2 && inputArgs[0].msg) {
        inputArgs[0].originalMsg = inputArgs[0].msg
      }
      return method.apply(this, inputArgs)
    }
  }
})

log.info('no original message')
log.info({ msg: 'mapped to originalMsg' }, 'a message')

// {"level":30,"time":1596313323106,"pid":63739,"hostname":"foo","msg":"no original message"}
// {"level":30,"time":1596313323107,"pid":63739,"hostname":"foo","msg":"a message","originalMsg":"mapped to originalMsg"}
```

<a id="best-performance-for-stdout"></a>
## Best performance for logging to `stdout`

The best performance for logging directly to stdout is _usually_ achieved by using the
default configuration:

```js
const log = require('pino')();
```

You should only have to configure custom transports or other settings
if you have broader logging requirements.

<a id="testing"></a>
## Testing

See [`pino-test`](https://github.com/pinojs/pino-test).
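As a dependency-free sketch of the same idea (pino-test offers richer helpers), log lines can be captured by passing any writable stream as the destination; the capture stream and assertion below are illustrative:

```js
const { Writable } = require('node:stream')
const pino = require('pino')

// Collect each emitted line so a test can assert on it
const lines = []
const capture = new Writable({
  write (chunk, enc, cb) {
    lines.push(JSON.parse(chunk.toString()))
    cb()
  }
})

const logger = pino(capture)
logger.info('hello world')

// e.g. inside a test: assert on the captured line
console.assert(lines[0].msg === 'hello world' && lines[0].level === 30)
```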
64
node_modules/pino/docs/lts.md
generated
vendored
Normal file

## Long Term Support

Pino's Long Term Support (LTS) is provided according to the schedule laid
out in this document:

1. Major releases (the "X" in [semantic versioning][semver] X.Y.Z release
versions) are supported for a minimum period of six months from their release
date. The release date of any specific version can be found at
[https://github.com/pinojs/pino/releases](https://github.com/pinojs/pino/releases).

1. Major releases will receive security updates for an additional six months
from the release of the next major release. After this period
we will still review and release security fixes as long as they are
provided by the community and they do not violate other constraints,
e.g. minimum supported Node.js version.

1. Major releases will be tested and verified against all Node.js
release lines that are supported by the
[Node.js LTS policy](https://github.com/nodejs/Release) within the
LTS period of that given Pino release line. This implies that only
the latest Node.js release of a given line is supported.

A "month" is defined as 30 consecutive days.

> ## Security Releases and Semver
>
> As a consequence of providing long-term support for major releases, there
> are occasions where we need to release breaking changes as a _minor_
> version release. Such changes will _always_ be noted in the
> [release notes](https://github.com/pinojs/pino/releases).
>
> To avoid automatically receiving breaking security updates it is possible to use
> the tilde (`~`) range qualifier. For example, to get patches for the 6.1
> release, and avoid automatically updating to the 6.2 release, specify
> the dependency as `"pino": "~6.1.x"`. This will leave your application vulnerable,
> so please use with caution.

[semver]: https://semver.org/

<a name="lts-schedule"></a>

### Schedule

| Version | Release Date | End Of LTS Date | Node.js        |
| :------ | :----------- | :-------------- | :------------- |
| 9.x     | 2024-04-26   | TBD             | 18, 20, 22     |
| 8.x     | 2022-06-01   | 2024-10-26      | 14, 16, 18, 20 |
| 7.x     | 2021-10-14   | 2023-06-01      | 12, 14, 16     |
| 6.x     | 2020-03-07   | 2022-04-14      | 10, 12, 14, 16 |

<a name="supported-os"></a>

### CI tested operating systems

Pino uses GitHub Actions for CI testing; please refer to
[GitHub's documentation regarding workflow runners](https://docs.github.com/en/actions/using-github-hosted-runners/about-github-hosted-runners#supported-runners-and-hardware-resources)
for further details on what the latest virtual environment is in relation to
the YAML workflow labels below:

| OS      | YAML Workflow Label | Node.js    |
|---------|---------------------|------------|
| Linux   | `ubuntu-latest`     | 18, 20, 22 |
| Windows | `windows-latest`    | 18, 20, 22 |
| MacOS   | `macos-latest`      | 18, 20, 22 |
35
node_modules/pino/docs/pretty.md
generated
vendored
Normal file

# Pretty Printing

By default, Pino log lines are newline delimited JSON (NDJSON). This is perfect
for production usage and long-term storage. It's not so great for development
environments. Thus, Pino logs can be prettified by using a Pino prettifier
module like [`pino-pretty`][pp]:

1. Install a prettifier module as a separate dependency, e.g. `npm install pino-pretty`.
2. Instantiate the logger with the `transport.target` option set to `'pino-pretty'`:
   ```js
   const pino = require('pino')
   const logger = pino({
     transport: {
       target: 'pino-pretty'
     },
   })

   logger.info('hi')
   ```
3. The transport option can also have an options object containing `pino-pretty` options:
   ```js
   const pino = require('pino')
   const logger = pino({
     transport: {
       target: 'pino-pretty',
       options: {
         colorize: true
       }
     }
   })

   logger.info('hi')
   ```

[pp]: https://github.com/pinojs/pino-pretty
135
node_modules/pino/docs/redaction.md
generated
vendored
Normal file

# Redaction

> Redaction is not supported in the browser [#670](https://github.com/pinojs/pino/issues/670)

To redact sensitive information, supply paths to keys that hold sensitive data
using the `redact` option. Note that paths that contain hyphens need to use
brackets to access the hyphenated property:

```js
const logger = require('.')({
  redact: ['key', 'path.to.key', 'stuff.thats[*].secret', 'path["with-hyphen"]']
})

logger.info({
  key: 'will be redacted',
  path: {
    to: {key: 'sensitive', another: 'thing'}
  },
  stuff: {
    thats: [
      {secret: 'will be redacted', logme: 'will be logged'},
      {secret: 'as will this', logme: 'as will this'}
    ]
  }
})
```

This will output:

```JSON
{"level":30,"time":1527777350011,"pid":3186,"hostname":"Davids-MacBook-Pro-3.local","key":"[Redacted]","path":{"to":{"key":"[Redacted]","another":"thing"}},"stuff":{"thats":[{"secret":"[Redacted]","logme":"will be logged"},{"secret":"[Redacted]","logme":"as will this"}]}}
```

The `redact` option can take an array (as shown in the above example) or
an object. This allows control over *how* information is redacted.

For instance, setting the censor:

```js
const logger = require('.')({
  redact: {
    paths: ['key', 'path.to.key', 'stuff.thats[*].secret'],
    censor: '**GDPR COMPLIANT**'
  }
})

logger.info({
  key: 'will be redacted',
  path: {
    to: {key: 'sensitive', another: 'thing'}
  },
  stuff: {
    thats: [
      {secret: 'will be redacted', logme: 'will be logged'},
      {secret: 'as will this', logme: 'as will this'}
    ]
  }
})
```

This will output:

```JSON
{"level":30,"time":1527778563934,"pid":3847,"hostname":"Davids-MacBook-Pro-3.local","key":"**GDPR COMPLIANT**","path":{"to":{"key":"**GDPR COMPLIANT**","another":"thing"}},"stuff":{"thats":[{"secret":"**GDPR COMPLIANT**","logme":"will be logged"},{"secret":"**GDPR COMPLIANT**","logme":"as will this"}]}}
```

The `redact.remove` option also allows for the key and value to be removed from output:

```js
const logger = require('.')({
  redact: {
    paths: ['key', 'path.to.key', 'stuff.thats[*].secret'],
    remove: true
  }
})

logger.info({
  key: 'will be redacted',
  path: {
    to: {key: 'sensitive', another: 'thing'}
  },
  stuff: {
    thats: [
      {secret: 'will be redacted', logme: 'will be logged'},
      {secret: 'as will this', logme: 'as will this'}
    ]
  }
})
```

This will output:

```JSON
{"level":30,"time":1527782356751,"pid":5758,"hostname":"Davids-MacBook-Pro-3.local","path":{"to":{"another":"thing"}},"stuff":{"thats":[{"logme":"will be logged"},{"logme":"as will this"}]}}
```

See [pino options in API](/docs/api.md#redact-array-object) for `redact` API details.

<a name="paths"></a>
## Path Syntax

The syntax for paths supplied to the `redact` option conforms to the syntax of path lookups
in standard ECMAScript, with the following additions:

* paths may start with bracket notation
* paths may contain the asterisk `*` to denote a wildcard
* paths are **case sensitive**

By way of example, the following are all valid paths:

* `a.b.c`
* `a["b-c"].d`
* `["a-b"].c`
* `a.b.*`
* `a[*].b`

## Overhead

Pino's redaction functionality is built on top of [`fast-redact`](https://github.com/davidmarkclements/fast-redact)
which adds about 2% overhead to `JSON.stringify` when using paths without wildcards.

When used with the pino logger with a single redacted path, any overhead is within noise -
a way to deterministically measure its effect has not been found, because it is not a bottleneck.

However, wildcard redaction does carry a non-trivial cost relative to explicitly declaring the keys
(50% in a case where four keys are redacted across two objects). See
the [`fast-redact` benchmarks](https://github.com/davidmarkclements/fast-redact#benchmarks) for details.

## Safety

The `redact` option is intended as an initialization-time configuration option.
Path strings must not originate from user input.
The `fast-redact` module uses a VM context to syntax-check the paths; user input
should never be combined with such an approach. See the [`fast-redact` Caveat](https://github.com/davidmarkclements/fast-redact#caveat)
and the [`fast-redact` Approach](https://github.com/davidmarkclements/fast-redact#approach) for in-depth information.
1263
node_modules/pino/docs/transports.md
generated
vendored
Normal file
File diff suppressed because it is too large
309
node_modules/pino/docs/web.md
generated
vendored
Normal file

# Web Frameworks

Since HTTP logging is a primary use case, Pino has first-class support for the Node.js
web framework ecosystem.

- [Web Frameworks](#web-frameworks)
  - [Pino with Fastify](#pino-with-fastify)
  - [Pino with Express](#pino-with-express)
  - [Pino with Hapi](#pino-with-hapi)
  - [Pino with Restify](#pino-with-restify)
  - [Pino with Koa](#pino-with-koa)
  - [Pino with Node core `http`](#pino-with-node-core-http)
  - [Pino with Nest](#pino-with-nest)
  - [Pino with H3](#pino-with-h3)
  - [Pino with Hono](#pino-with-hono)

<a id="fastify"></a>
## Pino with Fastify

The Fastify web framework comes bundled with Pino by default; simply set Fastify's
`logger` option to `true` and use `request.log` or `reply.log` for log messages that correspond
to each request:

```js
const fastify = require('fastify')({
  logger: true
})

fastify.get('/', async (request, reply) => {
  request.log.info('something')
  return { hello: 'world' }
})

fastify.listen({ port: 3000 }, (err) => {
  if (err) {
    fastify.log.error(err)
    process.exit(1)
  }
})
```

The `logger` option can also be set to an object, which will be passed through directly
as the [`pino` options object](/docs/api.md#options-object).
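For instance, a minimal sketch passing Pino options through Fastify's `logger` option (the level and redact path shown are illustrative):

```js
const fastify = require('fastify')({
  logger: {
    level: 'warn',                          // passed straight through to pino
    redact: ['req.headers.authorization']   // illustrative redact path
  }
})
```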
See the [fastify documentation](https://www.fastify.io/docs/latest/Reference/Logging/) for more information.

<a id="express"></a>
## Pino with Express

```sh
npm install pino-http
```

```js
const app = require('express')()
const pino = require('pino-http')()

app.use(pino)

app.get('/', function (req, res) {
  req.log.info('something')
  res.send('hello world')
})

app.listen(3000)
```

See the [pino-http README](https://npm.im/pino-http) for more info.

<a id="hapi"></a>
## Pino with Hapi

```sh
npm install hapi-pino
```

```js
'use strict'

const Hapi = require('@hapi/hapi')
const Pino = require('hapi-pino')

async function start () {
  // Create a server with a host and port
  const server = Hapi.server({
    host: 'localhost',
    port: 3000
  })

  // Add the route
  server.route({
    method: 'GET',
    path: '/',
    handler: async function (request, h) {
      // request.log is HAPI's standard way of logging
      request.log(['a', 'b'], 'Request into hello world')

      // a pino instance can also be used, which will be faster
      request.logger.info('In handler %s', request.path)

      return 'hello world'
    }
  })

  await server.register(Pino)

  // also as a decorated API
  server.logger.info('another way for accessing it')

  // and through Hapi standard logging system
  server.log(['subsystem'], 'third way for accessing it')

  await server.start()

  return server
}

start().catch((err) => {
  console.log(err)
  process.exit(1)
})
```

See the [hapi-pino README](https://npm.im/hapi-pino) for more info.

<a id="restify"></a>
## Pino with Restify

```sh
npm install restify-pino-logger
```

```js
const server = require('restify').createServer({name: 'server'})
const pino = require('restify-pino-logger')()

server.use(pino)

server.get('/', function (req, res) {
  req.log.info('something')
  res.send('hello world')
})

server.listen(3000)
```

See the [restify-pino-logger README](https://npm.im/restify-pino-logger) for more info.

<a id="koa"></a>
## Pino with Koa

```sh
npm install koa-pino-logger
```

```js
const Koa = require('koa')
const app = new Koa()
const pino = require('koa-pino-logger')()

app.use(pino)

app.use((ctx) => {
  ctx.log.info('something else')
  ctx.body = 'hello world'
})

app.listen(3000)
```

See the [koa-pino-logger README](https://github.com/pinojs/koa-pino-logger) for more info.

<a id="http"></a>
## Pino with Node core `http`

```sh
npm install pino-http
```

```js
const http = require('http')
const server = http.createServer(handle)
const logger = require('pino-http')()

function handle (req, res) {
  logger(req, res)
  req.log.info('something else')
  res.end('hello world')
}

server.listen(3000)
```

See the [pino-http README](https://npm.im/pino-http) for more info.


<a id="nest"></a>
## Pino with Nest

```sh
npm install nestjs-pino
```

```ts
import { NestFactory } from '@nestjs/core'
import { Controller, Get, Module } from '@nestjs/common'
import { LoggerModule, Logger } from 'nestjs-pino'

@Controller()
export class AppController {
  constructor(private readonly logger: Logger) {}

  @Get()
  getHello() {
    this.logger.log('something')
    return `Hello world`
  }
}

@Module({
  controllers: [AppController],
  imports: [LoggerModule.forRoot()]
})
class MyModule {}

async function bootstrap() {
  const app = await NestFactory.create(MyModule)
  await app.listen(3000)
}
bootstrap()
```

See the [nestjs-pino README](https://npm.im/nestjs-pino) for more info.


<a id="h3"></a>
## Pino with H3

```sh
npm install pino-http h3
```

Save as `server.mjs`:

```js
import { createApp, createRouter, eventHandler, fromNodeMiddleware } from "h3";
import pino from 'pino-http'

export const app = createApp();

const router = createRouter();
app.use(router);
app.use(fromNodeMiddleware(pino()))

app.use(eventHandler((event) => {
  event.node.req.log.info('something')
  return 'hello world'
}))

router.get(
  "/",
  eventHandler((event) => {
    return { path: event.path, message: "Hello World!" };
  }),
);
```

Execute `npx --yes listhen -w --open ./server.mjs`.

See the [pino-http README](https://npm.im/pino-http) for more info.


<a id="hono"></a>
## Pino with Hono

```sh
npm install pino pino-http hono
```

```js
import { serve } from '@hono/node-server';
import { Hono } from 'hono';
import { requestId } from 'hono/request-id';
import { pinoHttp } from 'pino-http';

const app = new Hono();
app.use(requestId());
app.use(async (c, next) => {
  // pass hono's request-id to pino-http
  c.env.incoming.id = c.var.requestId;

  // map express style middleware to hono
  await new Promise((resolve) => pinoHttp()(c.env.incoming, c.env.outgoing, () => resolve()));

  c.set('logger', c.env.incoming.log);

  await next();
});

app.get('/', (c) => {
  c.var.logger.info('something');

  return c.text('Hello Node.js!');
});

serve(app);
```

See the [pino-http README](https://npm.im/pino-http) for more info.