From 49cdad04bc15ffd677f5475bdf63c969c0bdb02e Mon Sep 17 00:00:00 2001 From: stayseesong Date: Fri, 16 Dec 2022 15:46:07 -0800 Subject: [PATCH 01/15] Analytics Node Next --- .../catalog/libraries/server/node/next.md | 616 ++++++++++++++++++ 1 file changed, 616 insertions(+) create mode 100644 src/connections/sources/catalog/libraries/server/node/next.md diff --git a/src/connections/sources/catalog/libraries/server/node/next.md b/src/connections/sources/catalog/libraries/server/node/next.md new file mode 100644 index 0000000000..2be6e66c77 --- /dev/null +++ b/src/connections/sources/catalog/libraries/server/node/next.md @@ -0,0 +1,616 @@ +--- +title: Analytics for Node.js Next +repo: analytics-node +strat: node-js +--- + +Segment's Analytics Node.js Next library lets you record analytics data from your node code. The requests hit Segment's servers, and then Segment routes your data to any destinations you have enabled. + +The [Segment Analytics Node.js Next library is open-source](https://github.com/segmentio/analytics-next){:target="_blank"} on GitHub. + +All of Segment's server-side libraries are built for high-performance, so you can use them in your web server controller code. This library uses an internal queue to make `identify` and `track` calls non-blocking and fast. It also batches messages and flushes asynchronously to Segment's servers. + +## Getting Started + +> warning "" +> Make sure you're using a version of Node that's 14 or higher. + +Run: + +```bash +# npm +npm install @segment/analytics-node +# yarn +yarn add @segment/analytics-node +# pnpm +pnpm install @segment/analytics-node +``` + +This will add Segment's Node library module to your `package.json`. The module exposes an `Analytics` constructor, which you need to initialize with your Segment source's **Write Key**, like so: + +```javascript +var Analytics = require('analytics-node'); +var analytics = new Analytics('YOUR_WRITE_KEY'); +``` + +Be sure to replace `YOUR_WRITE_KEY` with your actual **Write Key** which you can find in Segment under your source settings. + +This will create an instance of `Analytics` that you can use to send data to Segment for your project. The default initialization settings are production-ready and queue 20 messages before sending any requests. In development you might want to use [development settings](/docs/connections/sources/catalog/libraries/server/node/#development). + +### Regional configuration +For Business plans with access to [Regional Segment](/docs/guides/regional-segment), you can use the `host` configuration parameter to send data to the desired region: +1. Oregon (Default) — `api.segment.io/v1` +2. Dublin — `events.eu1.segmentapis.com` + +An example of setting the host to the EU endpoint using the Node library is: +```javascript +const analytics = new Analytics({ + ... + host: "https://events.eu1.segmentapis.com" +}); +``` + +## Basic tracking methods +The basic tracking methods below serve as the building blocks of your Segment tracking. They include [Identify](#identify), [Track](#track), [Page](#page), [Group](#group), and [Alias](#alias). + +These methods correspond with those used in the [Segment Spec](/docs/connections/spec/). The documentation on this page explains how to use these methods in Analytics Node.js Next. + + +### Identify + +> info "Good to know" +> For any of the different methods described on this page, you can replace the properties and traits in the code samples with variables that represent the data collected. 
+ +`identify` lets you tie a user to their actions and record traits about them. It includes a unique User ID and/or anonymous ID, and any optional traits you know about them. + +You should call `identify` once when the user's account is first created, and then again any time their traits change. + +Example of an anonymous `identify` call: + +```javascript +analytics.identify({ + anonymousId: '48d213bb-95c3-4f8d-af97-86b2b404dcfe', + traits: { + friends: 42 + } +}); +``` + +This call identifies the user and records their unique anonymous ID, and labels them with the `friends` trait. + +Example of an `identify` call for an identified user: + +```javascript +analytics.identify({ + userId: '019mr8mf4r', + traits: { + name: 'Michael Bolton', + email: 'mbolton@example.com', + plan: 'Enterprise', + friends: 42 + } +}); +``` +The call above identifies Michael by his unique User ID (the one you know him by in your database), and labels him with the `name`, `email`, `plan` and `friends` traits. + +The `identify` call has the following fields: + + + + + + + + + + + + + + + + + + + + + + +
Field | Details
----- | -------
`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any identify call._
`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: You must include at least one of `userId` or `anonymousId` in all identify calls._
`traits` _Object, optional_ | A dictionary of [traits](/docs/connections/spec/identify#traits) you know about the user, like `email`, `name`, or `friends`.
`timestamp` _Date, optional_ | A JavaScript date object representing when the identify call took place. If the identify just happened, leave it out and Segment's servers will timestamp the request for you. If you're importing data from the past, make sure you send a `timestamp`.
`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it doesn't contain attributes of the user itself._
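
For example, here's a sketch of an `identify` call that also supplies the optional `timestamp` and `context` fields from the table above (the specific values are illustrative):

```javascript
analytics.identify({
  userId: '019mr8mf4r',
  traits: {
    plan: 'Enterprise'
  },
  // An explicit timestamp is only needed when importing past data
  timestamp: new Date('2022-11-01T09:30:00Z'),
  // Context describes the environment of the call, not the user
  context: {
    ip: '8.8.8.8'
  }
});
```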
+ +Find details on the **identify method payload** in our [Spec](/docs/connections/spec/identify/). + +### Track + +`track` lets you record the actions your users perform. Every action triggers what we call an "event", which can also have associated properties. + +You'll want to track events that are indicators of success for your site, like **Signed Up**, **Item Purchased** or **Article Bookmarked**. + +To get started, we recommend tracking just a few important events. You can always add more later! + +Example anonymous `track` call: + +```javascript +analytics.track({ + anonymousId: '48d213bb-95c3-4f8d-af97-86b2b404dcfe', + event: 'Item Purchased', + properties: { + revenue: 39.95, + shippingMethod: '2-day' + } +}); +``` + +Example identified `track` call: + +```javascript +analytics.track({ + userId: '019mr8mf4r', + event: 'Item Purchased', + properties: { + revenue: 39.95, + shippingMethod: '2-day' + } +}); +``` + +This example `track` call tells us that your user just triggered the **Item Purchased** event with a revenue of $39.95 and chose your hypothetical '2-day' shipping. + +`track` event properties can be anything you want to record. In this case, revenue and shipping method. + +The `track` call has the following fields: + + + + + + + + + + + + + + + + + + + + + + + + + + +
Field | Details
----- | -------
`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any track call._
`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: You must include at least one of `userId` or `anonymousId` in all track calls._
`event` _String_ | The name of the event you're tracking. We recommend human-readable names like `Song Played` or `Status Updated`.
`properties` _Object, optional_ | A dictionary of properties for the event. If the event was `Product Added`, it might have properties like `price` or `product`.
`timestamp` _Date, optional_ | A JavaScript date object representing when the track call took place. If the track just happened, leave it out and Segment's servers will timestamp the request for you. If you're importing data from the past, make sure you send a `timestamp`.
`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it doesn't contain attributes of the user itself._
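
For example, a sketch of a `track` call that sets an explicit `timestamp`, which is useful when importing past events (see the Historical Import section below). The values are illustrative:

```javascript
analytics.track({
  userId: '019mr8mf4r',
  event: 'Item Purchased',
  properties: {
    revenue: 39.95
  },
  // When the event actually happened, not when it was sent
  timestamp: new Date('2022-10-05T14:48:00Z')
});
```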
+ +Find details on **best practices in event naming** as well as the **`track` method payload** in our [Spec](/docs/connections/spec/track/). + +### Page + +The [`page`](/docs/connections/spec/page/) method lets you record page views on your website, along with optional extra information about the page being viewed. + +If you're using our client-side set up in combination with the Node.js library, page calls are **already tracked for you** by default. However, if you want to record your own page views manually and aren't using our client-side library, read on! + +Example `page` call: + +```js +analytics.page({ + userId: '019mr8mf4r', + category: 'Docs', + name: 'Node.js Library', + properties: { + url: 'https://segment.com/docs/connections/sources/catalog/librariesnode', + path: '/docs/connections/sources/catalog/librariesnode/', + title: 'Node.js Library - Segment', + referrer: 'https://github.com/segmentio/analytics-node' + } +}); +``` + +The `page` call has the following fields: + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Field | Details
----- | -------
`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any page call._
`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: at least one of `userId` or `anonymousId` must be included in any page call._
`category` _String, optional_ | The category of the page. Useful for things like ecommerce where many pages often live under a larger category.
`name` _String, optional_ | The name of the page, for example **Signup** or **Home**.
`properties` _Object, optional_ | A dictionary of properties of the page. A few properties are specially recognized and automatically translated: `url`, `title`, `referrer`, and `path`. You can also add your own.
`timestamp` _Date, optional_ | A JavaScript date object representing when the page call took place. If it just happened, leave it out and Segment's servers will timestamp the request for you. If you're importing data from the past, make sure you send a `timestamp`.
`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it doesn't contain attributes of the user itself._
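
For example, a sketch of a `page` call keyed on `anonymousId` instead of `userId`, for visitors you haven't identified yet (the values are illustrative):

```javascript
analytics.page({
  anonymousId: '48d213bb-95c3-4f8d-af97-86b2b404dcfe',
  name: 'Pricing',
  properties: {
    url: 'https://example.com/pricing',
    title: 'Pricing - Example',
    path: '/pricing'
  }
});
```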
+ +Find details on the **`page` payload** in our [Spec](/docs/connections/spec/page/). + +### Group + +`group` lets you associate an [identified user](/docs/connections/sources/catalog/libraries/server/node/#identify) with a group. A group could be a company, organization, account, project or team! It also lets you record custom traits about the group, like industry or number of employees. + +This is useful for tools like [Intercom](/docs/connections/destinations/catalog/intercom/), [Preact](/docs/connections/destinations/catalog/preact/) and [Totango](/docs/connections/destinations/catalog/totango/), as it ties the user to a **group** of other users. + +Example `group` call: + +```javascript +analytics.group({ + userId: '019mr8mf4r', + groupId: '56', + traits: { + name: 'Initech', + description: 'Accounting Software' + } +}); +``` + +The `group` call has the following fields: + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Field | Details
----- | -------
`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any group call._
`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: at least one of `userId` or `anonymousId` must be included in any group call._
`groupId` _String_ | The ID of the group.
`traits` _Object, optional_ | A dictionary of traits you know about the group. For a company, they might be things like `name`, `address`, or `phone`.
`context` _Object, optional_ | A dictionary containing any context about the request. To see the full reference of supported keys, check out the [context reference](/docs/connections/spec/common/#context).
`timestamp` _Date, optional_ | A JavaScript date object representing when the group call took place. If it just happened, leave it out and Segment's servers will timestamp the request for you. If you're importing data from the past, make sure you send a `timestamp`.
`integrations` _Object, optional_ | A dictionary of destinations to enable or disable.
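
For example, a sketch of a `group` call that also sets the optional `timestamp` and `integrations` fields (the destination flag is illustrative; see Selecting Destinations below):

```javascript
analytics.group({
  userId: '019mr8mf4r',
  groupId: '56',
  traits: {
    name: 'Initech'
  },
  // When the user actually joined the group
  timestamp: new Date('2022-09-12T08:00:00Z'),
  // Route this call only to the destinations you list
  integrations: {
    'All': false,
    'Intercom': true
  }
});
```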
+ +Find more details about `group`, including the **`group` payload**, in our [Spec](/docs/connections/spec/group/). + +### Alias + +The `alias` call allows you to associate one identity with another. This is an advanced method and should not be widely used, but is required to manage user identities in _some_ destinations. Other destinations do not support the alias call. + +In [Mixpanel](/docs/connections/destinations/catalog/mixpanel/#alias) it's used to associate an anonymous user with an identified user once they sign up. For [Kissmetrics](/docs/connections/destinations/catalog/kissmetrics/#alias), if your user switches IDs, you can use 'alias' to rename the 'userId'. + +Example `alias` call: + +```javascript +analytics.alias({ + previousId: 'old_id', + userId: 'new_id' +}); +``` + +The `alias` call has the following fields: + + + + + + + + + + +
Field | Details
----- | -------
`userId` _String_ | The ID for this user in your database.
`previousId` _String_ | The previous ID to alias from.
+ +Here's a full example of how Segment might use the `alias` call: + +```javascript +// the anonymous user does actions ... +analytics.track({ userId: 'anonymous_user', event: 'Anonymous Event' }) +// the anonymous user signs up and is aliased +analytics.alias({ previousId: 'anonymous_user', userId: 'identified@example.com' }) +// the identified user is identified +analytics.identify({ userId: 'identified@example.com', traits: { plan: 'Free' } }) +// the identified user does actions ... +analytics.track({ userId: 'identified@example.com', event: 'Identified Action' }) +``` + +For more details about `alias`, including the **`alias` call payload**, check out the Segment [Spec](/docs/connections/spec/alias/). + +--- + + +## Configuration + +The second argument to the `Analytics` constructor is an optional list of settings to configure the module. + +```javascript +const analytics = new Analytics({ + writeKey: '', + host: 'https://api.segment.io', + path: '/v1/batch', + maxRetries: 3, + maxEventsInBatch: 15, + flushInterval: 10000, + // ... and more! + }) +``` + +Setting | Details +------- | -------- +`writeKey` _string_ | The key that corresponds to your Segment.io project +`host` _string_ | The base URL of the API. The default is: "https://api.segment.io" +`path` _string_ | The API path route. The default is: "/v1/batch" +`maxRetries` _number_ | The number of times to retry flushing a batch. The default is: `3` +`maxEventsInBatch` _number_ | The number of messages to enqueue before flushing. The default is: `15` +`flushInterval` _number_ | The number of milliseconds to wait before flushing the queue automatically. The default is: `10000` +`httpRequestTimeout` | The maximum number of milliseconds to wait for an http request. The default is: `10000` + +### Graceful shutdown +Avoid losing events after shutting down your console. Call `.closeAndFlush()` to stop collecting new events and flush all existing events. If a callback on an event call is included, this also waits for all callbacks to be called, and any of their subsequent promises to be resolved. + +```javascript +await analytics.closeAndFlush() +// or +await analytics.closeAndFlush({ timeout: 5000 }) // force resolve after 5000ms +``` + +Here's an example of how to use graceful shutdown: +```javascript +const app = express() +const server = app.listen(3000) + +const onExit = async () => { + await analytics.closeAndFlush() + server.close(() => { + console.log("Gracefully closing server...") + process.exit() + }) +} +['SIGINT', 'SIGTERM'].forEach((code) => process.on(code, onExit)) +``` + +### Collect unflushed events +If you need to preserve all of your events in the instance of a forced timeout, even ones that came in after analytics.closeAndFlush() was called, you can still collect those events by using: + +```javascript +const unflushedEvents = [] + +analytics.on('call_after_close', (event) => unflushedEvents.push(events)) +await analytics.closeAndFlush() + +console.log(unflushedEvents) // all events that came in after closeAndFlush was called +``` + +## Error handling + +To keep track of errors, subscribe and log all event delivery errors by running: + +```javascript +const analytics = new Analytics({ writeKey: '' }) + +analytics.on('error', (err) => console.error(err)) +``` + + +### Event emitter interface +The event emitter interface allows you to track when certain things happen in the app, such as a track call or an error, and it will call the function you provided with some arguments when that event happens. 
+ +```javascript +analytics.on('error', (err) => console.error(err)) + +analytics.on('identify', (ctx) => console.log(err)) + +analytics.on('track', (ctx) => console.log(ctx)) +``` + + +## Development + +You can use this initialization during development to make the library flush every time a message is submitted, so that you can be sure your calls are working properly before pushing to production. + +```javascript +var analytics = new Analytics('YOUR_WRITE_KEY', { flushAt: 1 }); +``` + + +## Selecting Destinations + +The `alias`, `group`, `identify`, `page` and `track` calls can all be passed an object of `integrations` that lets you turn certain destinations on or off. By default all destinations are enabled. + +Here's an example with the `integrations` object shown: + +```javascript +analytics.track({ + event: 'Membership Upgraded', + userId: '97234974', + integrations: { + 'All': false, + 'Vero': true, + 'Google Analytics': false + } +}) +``` + +In this case, Segment specifies that they want this `track` to only go to Vero. `All: false` says that no destination should be enabled unless otherwise specified. `Vero: true` turns on Vero. + +Destination flags are **case sensitive** and match [the destination's name in the docs](/docs/connections/destinations/) (for example, "AdLearn Open Platform", "awe.sm", "MailChimp"). In some cases, there may be several names for a destination; if that happens you'll see a "Adding (destination name) to the Integrations Object" section in the destination's doc page with a list of valid names. + +**Note:** + +- Available at the business level, filtering track calls can be done right from the Segment UI on your source schema page. We recommend using the UI if possible since it's a much simpler way of managing your filters and can be updated with no code changes on your side. + +- If you are on a grandfathered plan, events sent server-side that are filtered through the Segment dashboard will still count towards your API usage. + +## Historical Import + +You can import historical data by adding the `timestamp` argument to any of your method calls. This can be helpful if you've just switched to Segment. + +Historical imports can only be done into destinations that can accept historical timestamped data. Most analytics tools like Mixpanel, Amplitude, Kissmetrics, etc. can handle that type of data just fine. One common destination that does not accept historical data is Google Analytics since their API cannot accept historical data. + +**Note:** If you're tracking things that are happening right now, leave out the `timestamp` and Segment's servers will timestamp the requests for you. + + +## Batching + +Segment's libraries are built to support high performance environments. That means it is safe to use Segment's Node library on a web server that's serving hundreds of requests per second. + +Every method you call **doesn't** result in a HTTP request, but is queued in memory instead. Messages are then flushed in batch in the background, which allows for much faster operation. + +By default, Segment's library will flush: + + - The very first time it gets a message. + - Every 15 messages (controlled by `settings.maxEventsInBatch`). + - If 10 seconds has passed since the last flush (controlled by `settings.flushInterval`) + +There is a maximum of `500KB` per batch request and `32KB` per call. 
+ +If you don't want to batch messages, you can turn batching off by setting the `maxEventsInBatch` setting to `1`, like so: + +```javascript +const analytics = new Analytics({ + ... + maxEventsInBatch: 1 +});``` + +Batching means that your message might not get sent right away. Every method call takes an optional `callback`, which you can use to know when a particular message is flushed from the queue, like so: + +```javascript +analytics.track({ + userId: '019mr8mf4r', + event: 'Ultimate Played', + }, + (err, ctx) => { + ... + } +) +``` + +You can also flush on demand. For example, at the end of your program, you need to flush to make sure that nothing is left in the queue. To do that, call the `flush` method: + +```javascript +analytics.flush(function(err, batch){ + console.log('Flushed, and now this program can exit!'); +}); +``` + + + + + + + +## Multiple Clients + +Different parts of your application may require different types of batching, or even sending to multiple Segment sources. In that case, you can initialize multiple instances of `Analytics` with different settings: + +```javascript +const marketingAnalytics = new Analytics({ writeKey: 'MARKETING_WRITE_KEY' }); +const appAnalytics = new Analytics({ writeKey: 'APP_WRITE_KEY' }); +``` + + +## Troubleshooting + +{% include content/troubleshooting-intro.md %} +{% include content/troubleshooting-server-debugger.md %} +{% include content/troubleshooting-server-integration.md %} From 66a2ca8b0ff76f5f16a8a68b1927a14d4829e6f8 Mon Sep 17 00:00:00 2001 From: stayseesong Date: Fri, 16 Dec 2022 16:27:15 -0800 Subject: [PATCH 02/15] edits --- .../catalog/libraries/server/node/index.md | 7 +- .../catalog/libraries/server/node/next.md | 134 +++++++++++++++++- .../libraries/server/node/quickstart.md | 5 +- 3 files changed, 141 insertions(+), 5 deletions(-) diff --git a/src/connections/sources/catalog/libraries/server/node/index.md b/src/connections/sources/catalog/libraries/server/node/index.md index e3d9f08e1c..46ea81e56d 100644 --- a/src/connections/sources/catalog/libraries/server/node/index.md +++ b/src/connections/sources/catalog/libraries/server/node/index.md @@ -1,11 +1,14 @@ --- -title: Analytics for Node.js +title: Analytics for Node.js Classic redirect_from: '/connections/sources/catalog/libraries/server/node-js/' repo: analytics-node strat: node-js --- -Our Node.js library lets you record analytics data from your node code. The requests hit our servers, and then we route your data to any destinations you have enabled. +> warning "Deprecation of Analytics Node.js Classic" +> On [date], Segment will end support for Analytics Node.js Classic, which includes versions [1.x.x] and older. Upgrade to Analytics Node.js 2.0. See the [Analytics Node.js 2.0] docs to learn more. + +Segment's Node.js library lets you record analytics data from your node code. The requests hit our servers, and then Segment routes your data to any destinations you have enabled. The [Segment Node.js library is open-source](https://github.com/segmentio/analytics-node) on GitHub. 
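For example, a sketch of tuning both batching settings at initialization (the numbers are illustrative, not recommendations):

```javascript
const analytics = new Analytics({
  writeKey: '<YOUR_WRITE_KEY>',
  maxEventsInBatch: 50, // flush once 50 messages are queued...
  flushInterval: 5000   // ...or after 5 seconds, whichever comes first
});
```
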
diff --git a/src/connections/sources/catalog/libraries/server/node/next.md b/src/connections/sources/catalog/libraries/server/node/next.md index 2be6e66c77..9e93819bba 100644 --- a/src/connections/sources/catalog/libraries/server/node/next.md +++ b/src/connections/sources/catalog/libraries/server/node/next.md @@ -1,15 +1,19 @@ --- -title: Analytics for Node.js Next +title: Analytics for Node.js 2.0 repo: analytics-node strat: node-js --- -Segment's Analytics Node.js Next library lets you record analytics data from your node code. The requests hit Segment's servers, and then Segment routes your data to any destinations you have enabled. +Segment's Analytics Node.js 2.0 library lets you record analytics data from your node code. The requests hit Segment's servers, and then Segment routes your data to any destinations you have enabled. The [Segment Analytics Node.js Next library is open-source](https://github.com/segmentio/analytics-next){:target="_blank"} on GitHub. All of Segment's server-side libraries are built for high-performance, so you can use them in your web server controller code. This library uses an internal queue to make `identify` and `track` calls non-blocking and fast. It also batches messages and flushes asynchronously to Segment's servers. +> info "Using Analytics for Node.js Classic?" +> If you’re still using the classic version of Analytics for Node.js, you can refer to the documentation [here](/docs/connections/sources/catalog/libraries/server/node/). +>

> On [date], Segment will end support for Analytics Node.js Classic, which includes versions [#] and older. Upgrade to Analytics Node.js 2.0. See the Analytics Node.js 2.0 docs to learn more.

For example, validating events before passing them along to other plugins. A failure here could halt the event pipeline.

See the example of how Analytics.js uses the [Event Validation plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/validation/index.ts){:target="_blank"} to verify that every event has the correct shape. | +| `enrichment` | Executes as the first level of event processing. These plugins modify an event.

See the example of how Analytics.js uses the [Page Enrichment plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/page-enrichment/index.ts){:target="_blank"} to enrich every event with page information. | +| `destination` | Executes as events begin to pass off to destinations.

This doesn't modify the event outside of the specific destination, and failure doesn't halt the execution. | +| `after` | Executes after all event processing completes. You can use this to perform cleanup operations.

An example of this is the [Segment.io Plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/segmentio/index.ts){:target="_blank"}, which waits for destinations to succeed or fail so it can send observability metrics. |
You can register a plugin like this:
+The [Segment Analytics Node.js Next library is open-source](https://github.com/segmentio/analytics-next/tree/master/packages/node){:target="_blank"} on GitHub. All of Segment's server-side libraries are built for high-performance, so you can use them in your web server controller code. This library uses an internal queue to make `identify` and `track` calls non-blocking and fast. It also batches messages and flushes asynchronously to Segment's servers. @@ -429,7 +429,7 @@ The event emitter interface allows you to track when certain things happen in th ```javascript analytics.on('error', (err) => console.error(err)) -analytics.on('identify', (ctx) => console.log(err)) +analytics.on('identify', (ctx) => console.log(ctx)) analytics.on('track', (ctx) => console.log(ctx)) ``` From 4144486da6dae6176610498a9a35cf702d567aee Mon Sep 17 00:00:00 2001 From: stayseesong Date: Thu, 5 Jan 2023 16:42:42 -0800 Subject: [PATCH 04/15] edits --- .../catalog/libraries/server/node/index.md | 3 +- .../catalog/libraries/server/node/next.md | 149 ++---------------- 2 files changed, 16 insertions(+), 136 deletions(-) diff --git a/src/connections/sources/catalog/libraries/server/node/index.md b/src/connections/sources/catalog/libraries/server/node/index.md index 70103f89fb..b066c664ef 100644 --- a/src/connections/sources/catalog/libraries/server/node/index.md +++ b/src/connections/sources/catalog/libraries/server/node/index.md @@ -3,10 +3,11 @@ title: Analytics for Node.js Classic redirect_from: '/connections/sources/catalog/libraries/server/node-js/' repo: analytics-node strat: node-js +hidden: true --- > warning "Deprecation of Analytics Node.js Classic" -> On [date], Segment will end support for Analytics Node.js Classic, which includes versions [1.x.x] and older. Upgrade to Analytics Node.js 2.0. See the [Analytics Node.js 2.0] docs to learn more. +> On [date], Segment will end support for Analytics Node.js Classic, which includes versions [1.x.x] and older. Upgrade to the new version of Analytics Node.js. See the updated [Analytics Node.js] docs to learn more. Segment's Node.js library lets you record analytics data from your node code. The requests hit Segment's servers, and then Segment routes your data to any destinations you have enabled. diff --git a/src/connections/sources/catalog/libraries/server/node/next.md b/src/connections/sources/catalog/libraries/server/node/next.md index 09f0096955..62b760a6d2 100644 --- a/src/connections/sources/catalog/libraries/server/node/next.md +++ b/src/connections/sources/catalog/libraries/server/node/next.md @@ -41,18 +41,6 @@ Be sure to replace `YOUR_WRITE_KEY` with your actual **Write Key** which you can This will create an instance of `Analytics` that you can use to send data to Segment for your project. The default initialization settings are production-ready and queue 20 messages before sending any requests. In development you might want to use [development settings](/docs/connections/sources/catalog/libraries/server/node/#development). -### Regional configuration -For Business plans with access to [Regional Segment](/docs/guides/regional-segment), you can use the `host` configuration parameter to send data to the desired region: -1. Oregon (Default) — `api.segment.io/v1` -2. Dublin — `events.eu1.segmentapis.com` - -An example of setting the host to the EU endpoint using the Node library is: -```javascript -const analytics = new Analytics({ - ... 
- host: "https://events.eu1.segmentapis.com" -}); -``` ## Basic tracking methods The basic tracking methods below serve as the building blocks of your Segment tracking. They include [Identify](#identify), [Track](#track), [Page](#page), [Group](#group), and [Alias](#alias). @@ -286,7 +274,7 @@ The `group` call has the following fields: `traits` _dict, optional_ - A dict of traits you know about the group. For a company, they might be things like `name`, `address`, or `phone`. + A dict of traits you know about the group. For a company, they might be things like `name`, `address`, or `phone`. [Learn more about traits](/docs/connections/spec/group/#traits). `context` _dict, optional_ @@ -412,6 +400,19 @@ await analytics.closeAndFlush() console.log(unflushedEvents) // all events that came in after closeAndFlush was called ``` +## Regional configuration +For Business plans with access to [Regional Segment](/docs/guides/regional-segment), you can use the `host` configuration parameter to send data to the desired region: +1. Oregon (Default) — `api.segment.io/v1` +2. Dublin — `events.eu1.segmentapis.com` + +An example of setting the host to the EU endpoint using the Node library is: +```javascript +const analytics = new Analytics({ + ... + host: "https://events.eu1.segmentapis.com" +}); +``` + ## Error handling To keep track of errors, subscribe and log all event delivery errors by running: @@ -486,13 +487,6 @@ const identityStitching = () => { type: 'enrichment', version: '0.1.0', - // use the `load` hook to bootstrap your plugin - // The load hook will receive a context object as its first argument - // followed by a reference to the analytics.js instance from the page - load: async (_ctx, ajs) => { - user = ajs.user() - }, - // Used to signal that a plugin has been property loaded isLoaded: () => user !== undefined, @@ -521,34 +515,6 @@ const identityStitching = () => { return identity } -// Registers Segment's new plugin into Analytics.js -await window.analytics.register(identityStitching()) -``` - -Here's an example of a `utility` plugin that allows you to change the format of the anonymous_id cookie: - -```js - -window.analytics.ready(() => { - window.analytics.register({ - name: 'Cookie Compatibility', - version: '0.1.0', - type: 'utility', - load: (_ctx, ajs) => { - const user = ajs.user() - const cookieJar = user.cookies - const cookieSetter = cookieJar.set.bind(cookieJar) - - // blindly convert any values into JSON strings - cookieJar.set = (key, value, opts) => cookieSetter(key, JSON.stringify(value), opts) - - // stringify any existing IDs - user.anonymousId(user.anonymousId()) - user.id(user.id()) - }, - isLoaded: () => true - }) - }) ``` You can view Segment's [existing plugins](https://github.com/segmentio/analytics-next/tree/master/src/plugins){:target="_blank"} to see more examples. @@ -560,16 +526,8 @@ Registering plugins enable you to modify your analytics implementation to best f // A promise will resolve once the plugins have been successfully loaded into Analytics.js // You can register multiple plugins at once by using the variable args interface in Analytics.js await window.analytics.register(pluginA, pluginB, pluginN) - -## Development - -You can use this initialization during development to make the library flush every time a message is submitted, so that you can be sure your calls are working properly before pushing to production. 
- -```javascript -var analytics = new Analytics('YOUR_WRITE_KEY', { flushAt: 1 }); ``` - ## Selecting Destinations The `alias`, `group`, `identify`, `page` and `track` calls can all be passed an object of `integrations` that lets you turn certain destinations on or off. By default all destinations are enabled. @@ -650,85 +608,6 @@ analytics.flush(function(err, batch){ }); ``` - - - - - - ## Multiple Clients Different parts of your application may require different types of batching, or even sending to multiple Segment sources. In that case, you can initialize multiple instances of `Analytics` with different settings: From ad10607ed65aeebeb9d296ced23d559706000e29 Mon Sep 17 00:00:00 2001 From: stayseesong Date: Fri, 6 Jan 2023 17:07:47 -0800 Subject: [PATCH 05/15] more edits --- .../catalog/libraries/server/node/classic.md | 484 ++++++++++++++ .../catalog/libraries/server/node/index.md | 611 ++++++++--------- .../catalog/libraries/server/node/next.md | 625 ------------------ 3 files changed, 777 insertions(+), 943 deletions(-) create mode 100644 src/connections/sources/catalog/libraries/server/node/classic.md delete mode 100644 src/connections/sources/catalog/libraries/server/node/next.md diff --git a/src/connections/sources/catalog/libraries/server/node/classic.md b/src/connections/sources/catalog/libraries/server/node/classic.md new file mode 100644 index 0000000000..bbe4075540 --- /dev/null +++ b/src/connections/sources/catalog/libraries/server/node/classic.md @@ -0,0 +1,484 @@ +--- +title: Analytics for Node.js Classic +repo: analytics-node +strat: node-js +hidden: true +--- + +> warning "Deprecation of Analytics Node.js Classic" +> On [date], Segment will end support for Analytics Node.js Classic, which includes versions [1.x.x] and older. Upgrade to the new version of Analytics Node.js. See the updated [Analytics Node.js] docs to learn more. + +Segment's Node.js library lets you record analytics data from your node code. The requests hit Segment's servers, and then Segment routes your data to any destinations you have enabled. + +The [Segment Node.js library is open-source](https://github.com/segmentio/analytics-node) on GitHub. + +All of Segment's server-side libraries are built for high-performance, so you can use them in your web server controller code. This library uses an internal queue to make `identify` and `track` calls non-blocking and fast. It also batches messages and flushes asynchronously to our servers. + +Want to stay updated on releases? Subscribe to the [release feed](https://github.com/segmentio/analytics-node/releases.atom). + +## Getting Started + +Run: + +```bash +npm install --save analytics-node +``` + +This will add our Node library module to your `package.json`. The module exposes an `Analytics` constructor, which you need to initialize with your Segment source's **Write Key**, like so: + +```javascript +var Analytics = require('analytics-node'); +var analytics = new Analytics('YOUR_WRITE_KEY'); +``` + +Of course, you'll want to replace `YOUR_WRITE_KEY` with your actual **Write Key** which you can find in Segment under your source settings. + +This will create an instance of `Analytics` that you can use to send data to Segment for your project. The default initialization settings are production-ready and queue 20 messages before sending any requests. In development you might want to use [development settings](/docs/connections/sources/catalog/libraries/server/node/#development). 
+ +### Regional configuration +For Business plans with access to [Regional Segment](/docs/guides/regional-segment), you can use the `host` configuration parameter to send data to the desired region: +1. Oregon (Default) — `api.segment.io/v1` +2. Dublin — `events.eu1.segmentapis.com` + +An example of setting the host to the EU endpoint using the Node library would be: +```javascript +var analytics = new Analytics('YOUR_WRITE_KEY', { + host: "https://events.eu1.segmentapis.com" + }); +``` + +## Identify + +> note "" +> **Good to know**: For any of the different methods described on this page, you can replace the properties and traits in the code samples with variables that represent the data collected. + +`identify` lets you tie a user to their actions and record traits about them. It includes a unique User ID and/or anonymous ID, and any optional traits you know about them. + +You should call `identify` once when the user's account is first created, and then again any time their traits change. + +Example of an anonymous `identify` call: + +```javascript +analytics.identify({ + anonymousId: '48d213bb-95c3-4f8d-af97-86b2b404dcfe', + traits: { + friends: 42 + } +}); +``` + +This call identifies the user and records their unique anonymous ID, and labels them with the `friends` trait. + +Example of an `identify` call for an identified user: + +```javascript +analytics.identify({ + userId: '019mr8mf4r', + traits: { + name: 'Michael Bolton', + email: 'mbolton@example.com', + plan: 'Enterprise', + friends: 42 + } +}); +``` +The call above identifies Michael by his unique User ID (the one you know him by in your database), and labels him with the `name`, `email`, `plan` and `friends` traits. + +The `identify` call has the following fields: + +Field | Details +----- | ------- +`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any identify call._ +`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: You must include at least one of `userId` or `anonymousId` in all identify calls._ +`traits` _Object, optional_ | A dictionary of [traits](/docs/connections/spec/identify#traits) you know about the user. Things like: `email`, `name` or `friends`. +`timestamp` _Date, optional_ | A JavaScript date object representing when the identify took place. If the identify just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`. +`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ + +Find details on the **identify method payload** in the Segment [Spec](/docs/connections/spec/identify/). + +## Track + +`track` lets you record the actions your users perform. Every action triggers what we call an "event", which can also have associated properties. + +You'll want to track events that are indicators of success for your site, like **Signed Up**, **Item Purchased** or **Article Bookmarked**. + +To get started, we recommend tracking just a few important events. You can always add more later! 
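If your project uses ES modules or TypeScript, an equivalent import-style initialization looks like the sketch below (the same `import Analytics from 'analytics-node'` style appears in the long running process example later on this page):

```javascript
import Analytics from 'analytics-node';

const analytics = new Analytics('YOUR_WRITE_KEY');
```
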
+ +Example anonymous `track` call: + +```javascript +analytics.track({ + anonymousId: '48d213bb-95c3-4f8d-af97-86b2b404dcfe', + event: 'Item Purchased', + properties: { + revenue: 39.95, + shippingMethod: '2-day' + } +}); +``` + +Example identified `track` call: + +```javascript +analytics.track({ + userId: '019mr8mf4r', + event: 'Item Purchased', + properties: { + revenue: 39.95, + shippingMethod: '2-day' + } +}); +``` + +This example `track` call tells us that your user just triggered the **Item Purchased** event with a revenue of $39.95 and chose your hypothetical '2-day' shipping. + +`track` event properties can be anything you want to record. In this case, revenue and shipping method. + +The `track` call has the following fields: + +Field | Details +----- | -------- +`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any track call. +`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: You must include at least one of `userId` or `anonymousId` in all track calls._ +`event` _String_ | The name of the event you're tracking. We recommend human-readable names like `Song Played` or `Status Updated`. +`properties` _Object, optional_ | A dictionary of properties for the event. If the event was `Product Added`, it might have properties like `price` or `product`. +`timestamp` _Date, optional_ | A JavaScript date object representing when the track took place. If the track just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`. +`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ + +Find details on **best practices in event naming** as well as the **`track` method payload** in the Segment [Spec](/docs/connections/spec/track/). + +## Page + +The [`page`](/docs/connections/spec/page/) method lets you record page views on your website, along with optional extra information about the page being viewed. + +If you're using our client-side set up in combination with the Node.js library, page calls are **already tracked for you** by default. However, if you want to record your own page views manually and aren't using our client-side library, read on! + +Example `page` call: + +```js +analytics.page({ + userId: '019mr8mf4r', + category: 'Docs', + name: 'Node.js Library', + properties: { + url: 'https://segment.com/docs/connections/sources/catalog/librariesnode', + path: '/docs/connections/sources/catalog/librariesnode/', + title: 'Node.js Library - Segment', + referrer: 'https://github.com/segmentio/analytics-node' + } +}); +``` + +The `page` call has the following fields: + +Field | Details +----- | -------- +`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any page call. +`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (eg., [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). 
_Note: at least one of `userId` or `anonymousId` must be included in any page call._ +`category` _String, optional_ | The category of the page. Useful for things like ecommerce where many pages often live under a larger category. +`name` _String, optional_ | The name of the page, for example **Signup** or **Home**. +`properties` _Object, optional_ | A dictionary of properties of the page. A few properties specially recognized and automatically translated: `url`, `title`, `referrer` and `path`, but you can add your own too. +`timestamp` _Date, optional_ | A JavaScript date object representing when the track took place. If the track just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`. +`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ + +Find details on the **`page` payload** in the Segment [Spec](/docs/connections/spec/page/). + +## Group + +`group` lets you associate an [identified user](/docs/connections/sources/catalog/libraries/server/node/#identify) with a group. A group could be a company, organization, account, project or team! It also lets you record custom traits about the group, like industry or number of employees. + +This is useful for tools like [Intercom](/docs/connections/destinations/catalog/intercom/), [Preact](/docs/connections/destinations/catalog/preact/) and [Totango](/docs/connections/destinations/catalog/totango/), as it ties the user to a **group** of other users. + +Example `group` call: + +```javascript +analytics.group({ + userId: '019mr8mf4r', + groupId: '56', + traits: { + name: 'Initech', + description: 'Accounting Software' + } +}); +``` + +The `group` call has the following fields: + +Field | Details +----- | -------- +`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any group call. +`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (eg., [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: at least one of `userId` or `anonymousId` must be included in any group call._ +`groupId` _string | The ID of the group. +`traits` _dict, optional_ | A dict of traits you know about the group. For a company, they might be things like `name`, `address`, or `phone`. [Learn more about traits](/docs/connections/spec/group/#traits). +`context` _dict, optional_ | A dict containing any context about the request. To see the full reference of supported keys, check them out in the [context reference](/docs/connections/spec/common/#context) +`timestamp` _datetime, optional_ | A `datetime` object representing when the group took place. If the group just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you send `timestamp`. +`integrations` _dict, optional_ | A dictionary of destinations to enable or disable. + +Find more details about `group`, including the **`group` payload**, in the Segment [Spec](/docs/connections/spec/group/). + +## Alias + +The `alias` call allows you to associate one identity with another. This is an advanced method and should not be widely used, but is required to manage user identities in _some_ destinations. 
Other destinations do not support the alias call. + +In [Mixpanel](/docs/connections/destinations/catalog/mixpanel/#alias) it's used to associate an anonymous user with an identified user once they sign up. For [Kissmetrics](/docs/connections/destinations/catalog/kissmetrics/#alias), if your user switches IDs, you can use 'alias' to rename the 'userId'. + +Example `alias` call: + +```javascript +analytics.alias({ + previousId: 'old_id', + userId: 'new_id' +}); +``` + +The `alias` call has the following fields: + +Field | Details +----- | -------- +`userId` _String_ | The ID for this user in your database. +`previousId` _String_ | The previous ID to alias from. + +Here's a full example of how we might use the `alias` call: + +```javascript +// the anonymous user does actions ... +analytics.track({ userId: 'anonymous_user', event: 'Anonymous Event' }) +// the anonymous user signs up and is aliased +analytics.alias({ previousId: 'anonymous_user', userId: 'identified@example.com' }) +// the identified user is identified +analytics.identify({ userId: 'identified@example.com', traits: { plan: 'Free' } }) +// the identified user does actions ... +analytics.track({ userId: 'identified@example.com', event: 'Identified Action' }) +``` + +For more details about `alias`, including the **`alias` call payload**, check out our [Spec](/docs/connections/spec/alias/). + +--- + + +## Configuration + +The second argument to the `Analytics` constructor is an optional dictionary of settings to configure the module. + +```javascript +var analytics = new Analytics('YOUR_WRITE_KEY', { + flushAt: 20, + flushInterval: 10000, + enable: false +}); +``` + +Setting | Details +------- | -------- +`flushAt` _Number_ | The number of messages to enqueue before flushing. +`flushInterval` _Number_ | The number of milliseconds to wait before flushing the queue automatically. +`enable` _Boolean_ | Enable (default) or disable flush. Useful when writing tests and you do not want to send data to Segment Servers. + + +### Error Handling + +Additionally there is an optional `errorHandler` property available to the class constructor's options. +If unspecified, the behaviour of the library does not change. +If specified, when an axios request fails, `errorHandler(axiosError)` will be called instead of re-throwing the axios error. + +Example usage: +```javascript +const Analytics = require('analytics-node'); + +const client = new Analytics('write key', { + errorHandler: (err) => { + console.error('analytics-node flush failed.') + console.error(err) + } +}); + +client.track({ + event: 'event name', + userId: 'user id' +}); + +``` +If this fails when flushed no exception will be thrown, instead the axios error will be logged to the console. + +## Development + +You can use this initialization during development to make the library flush every time a message is submitted, so that you can be sure your calls are working properly before pushing to production. + +```javascript +var analytics = new Analytics('YOUR_WRITE_KEY', { flushAt: 1 }); +``` + + +## Selecting Destinations + +The `alias`, `group`, `identify`, `page` and `track` calls can all be passed an object of `integrations` that lets you turn certain destinations on or off. By default all destinations are enabled. 
+ +Here's an example with the `integrations` object shown: + +```javascript +analytics.track({ + event: 'Membership Upgraded', + userId: '97234974', + integrations: { + 'All': false, + 'Vero': true, + 'Google Analytics': false + } +}) +``` + +In this case, we're specifying that we want this `track` to only go to Vero. `All: false` says that no destination should be enabled unless otherwise specified. `Vero: true` turns on Vero, etc. + +Destination flags are **case sensitive** and match [the destination's name in the docs](/docs/connections/destinations/) (i.e. "AdLearn Open Platform", "awe.sm", "MailChimp", etc.). In some cases, there may be several names for a destination; if that happens you'll see a "Adding (destination name) to the Integrations Object" section in the destination's doc page with a list of valid names. + +**Note:** + +- Available at the business level, filtering track calls can be done right from the Segment UI on your source schema page. We recommend using the UI if possible since it's a much simpler way of managing your filters and can be updated with no code changes on your side. + +- If you are on a grandfathered plan, events sent server-side that are filtered through the Segment dashboard will still count towards your API usage. + +## Historical Import + +You can import historical data by adding the `timestamp` argument to any of your method calls. This can be helpful if you've just switched to Segment. + +Historical imports can only be done into destinations that can accept historical timestamped data. Most analytics tools like Mixpanel, Amplitude, Kissmetrics, etc. can handle that type of data just fine. One common destination that does not accept historical data is Google Analytics since their API cannot accept historical data. + +**Note:** If you're tracking things that are happening right now, leave out the `timestamp` and our servers will timestamp the requests for you. + + +## Batching + +Our libraries are built to support high performance environments. That means it is safe to use our Node library on a web server that's serving hundreds of requests per second. + +Every method you call **does not** result in an HTTP request, but is queued in memory instead. Messages are then flushed in batch in the background, which allows for much faster operation. + +By default, our library will flush: + + - The very first time it gets a message. + - Every 20 messages (controlled by `options.flushAt`). + - If 10 seconds has passed since the last flush (controlled by `options.flushInterval`) + +There is a maximum of `500KB` per batch request and `32KB` per call. + +If you don't want to batch messages, you can turn batching off by setting the `flushAt` option to `1`, like so: + +```javascript +var analytics = new Analytics('YOUR_WRITE_KEY', { flushAt: 1 }); +``` + +Batching means that your message might not get sent right away. But every method call takes an optional `callback`, which you can use to know when a particular message is flushed from the queue, like so: + +```javascript +analytics.track({ + userId: '019mr8mf4r', + event: 'Ultimate Played' +}, function(err, batch){ + if (err) // There was an error flushing your message... + // Your message was successfully flushed! +}); +``` + +You can also flush on demand. For example, at the end of your program, you need to flush to make sure that nothing is left in the queue. 
To do that, call the `flush` method: + +```javascript +analytics.flush(function(err, batch){ + console.log('Flushed, and now this program can exit!'); +}); +``` + +## Long running process + +You should call `client.track(...)` and know that events will be queued and eventually sent to Segment. To prevent losing messages, be sure to capture any interruption (for example, a server restart) and call flush to know of and delay the process shutdown. + +```js +import { randomUUID } from 'crypto'; +import Analytics from 'analytics-node' + +const WRITE_KEY = '...'; + +const analytics = new Analytics(WRITE_KEY, { flushAt: 10 }); + +analytics.track({ + anonymousId: randomUUID(), + event: 'Test event', + properties: { + name: 'Test event', + timestamp: new Date() + } +}); + +const exitGracefully = async (code) => { + console.log('Flushing events'); + await analytics.flush(function(err, batch) { + console.log('Flushed, and now this program can exit!'); + process.exit(code); + }); +}; + +[ + 'beforeExit', 'uncaughtException', 'unhandledRejection', + 'SIGHUP', 'SIGINT', 'SIGQUIT', 'SIGILL', 'SIGTRAP', + 'SIGABRT','SIGBUS', 'SIGFPE', 'SIGUSR1', 'SIGSEGV', + 'SIGUSR2', 'SIGTERM', +].forEach(evt => process.on(evt, exitGracefully)); + +function logEvery2Seconds(i) { + setTimeout(() => { + console.log('Infinite Loop Test n:', i); + logEvery2Seconds(++i); + }, 2000); +} + +logEvery2Seconds(0); +``` + +## Short lived process + +Short-lived functions have a predictably short and linear lifecycle, so use a queue big enough to hold all messages and then await flush to complete its work. + + +```js +import { randomUUID } from 'crypto'; +import Analytics from 'analytics-node' + + +async function lambda() +{ + const WRITE_KEY = '...'; + const analytics = new Analytics(WRITE_KEY, { flushAt: 20 }); + analytics.flushed = true; + + analytics.track({ + anonymousId: randomUUID(), + event: 'Test event', + properties: { + name: 'Test event', + timestamp: new Date() + } + }); + await analytics.flush(function(err, batch) { + console.log('Flushed, and now this program can exit!'); + }); +} + +lambda(); +``` + + +## Multiple Clients + +Different parts of your application may require different types of batching, or even sending to multiple Segment sources. In that case, you can initialize multiple instances of `Analytics` with different settings: + +```javascript +var Analytics = require('analytics-node'); +var marketingAnalytics = new Analytics('MARKETING_WRITE_KEY'); +var appAnalytics = new Analytics('APP_WRITE_KEY'); +``` + + +## Troubleshooting + +{% include content/troubleshooting-intro.md %} +{% include content/troubleshooting-server-debugger.md %} +{% include content/troubleshooting-server-integration.md %} + diff --git a/src/connections/sources/catalog/libraries/server/node/index.md b/src/connections/sources/catalog/libraries/server/node/index.md index b066c664ef..ff4141c812 100644 --- a/src/connections/sources/catalog/libraries/server/node/index.md +++ b/src/connections/sources/catalog/libraries/server/node/index.md @@ -1,57 +1,62 @@ --- -title: Analytics for Node.js Classic +title: Analytics for Node.js redirect_from: '/connections/sources/catalog/libraries/server/node-js/' repo: analytics-node strat: node-js -hidden: true --- -> warning "Deprecation of Analytics Node.js Classic" -> On [date], Segment will end support for Analytics Node.js Classic, which includes versions [1.x.x] and older. Upgrade to the new version of Analytics Node.js. See the updated [Analytics Node.js] docs to learn more. 
+Segment's Analytics Node.js 2.0 library lets you record analytics data from your node code. The requests hit Segment's servers, and then Segment routes your data to any destinations you have enabled. -Segment's Node.js library lets you record analytics data from your node code. The requests hit Segment's servers, and then Segment routes your data to any destinations you have enabled. +The [Segment Analytics Node.js Next library is open-source](https://github.com/segmentio/analytics-next/tree/master/packages/node){:target="_blank"} on GitHub. -The [Segment Node.js library is open-source](https://github.com/segmentio/analytics-node) on GitHub. +All of Segment's server-side libraries are built for high-performance, so you can use them in your web server controller code. This library uses an internal queue to make `identify` and `track` calls non-blocking and fast. It also batches messages and flushes asynchronously to Segment's servers. -All of Segment's server-side libraries are built for high-performance, so you can use them in your web server controller code. This library uses an internal queue to make `identify` and `track` calls non-blocking and fast. It also batches messages and flushes asynchronously to our servers. - -Want to stay updated on releases? Subscribe to the [release feed](https://github.com/segmentio/analytics-node/releases.atom). +> info "Using Analytics for Node.js Classic?" +> If you’re still using the classic version of Analytics for Node.js, you can refer to the documentation [here](/docs/connections/sources/catalog/libraries/server/node/classic). +>

On [date], Segment will end support for Analytics Node.js Classic, which includes versions [1.x.x] and older. Upgrade to Analytics Node.js 2.0. See the Analytics Node.js 2.0 docs to learn more. ## Getting Started -Run: +> warning "" +> Make sure you're using a version of Node that's 14 or higher. -```bash -npm install --save analytics-node -``` +1. Run the following to add Segment's Node library module to your `package.json`. -This will add our Node library module to your `package.json`. The module exposes an `Analytics` constructor, which you need to initialize with your Segment source's **Write Key**, like so: + ```bash + # npm + npm install @segment/analytics-node + # yarn + yarn add @segment/analytics-node + # pnpm + pnpm install @segment/analytics-node + ``` -```javascript -var Analytics = require('analytics-node'); -var analytics = new Analytics('YOUR_WRITE_KEY'); -``` +2. Initialize the `Analytics` constructor the module exposes with your Segment source **Write Key**, like so: -Of course, you'll want to replace `YOUR_WRITE_KEY` with your actual **Write Key** which you can find in Segment under your source settings. + ```javascript + import { Analytics } from '@segment/analytics-node' + // or, if you use require: + const { Analytics } = require('@segment/analytics-node') -This will create an instance of `Analytics` that you can use to send data to Segment for your project. The default initialization settings are production-ready and queue 20 messages before sending any requests. In development you might want to use [development settings](/docs/connections/sources/catalog/libraries/server/node/#development). + // instantiation + const analytics = new Analytics({ writeKey: '' }) + ``` -### Regional configuration -For Business plans with access to [Regional Segment](/docs/guides/regional-segment), you can use the `host` configuration parameter to send data to the desired region: -1. Oregon (Default) — `api.segment.io/v1` -2. Dublin — `events.eu1.segmentapis.com` + Be sure to replace `YOUR_WRITE_KEY` with your actual **Write Key** which you can find in Segment under your source settings. -An example of setting the host to the EU endpoint using the Node library would be: -```javascript -var analytics = new Analytics('YOUR_WRITE_KEY', { - host: "https://events.eu1.segmentapis.com" - }); -``` + This creates an instance of `Analytics` that you can use to send data to Segment for your project. The default initialization settings are production-ready and queue 20 messages before sending any requests. In development you might want to use [development settings](/docs/connections/sources/catalog/libraries/server/node/#development). + + +## Basic tracking methods +The basic tracking methods below serve as the building blocks of your Segment tracking. They include [Identify](#identify), [Track](#track), [Page](#page), [Group](#group), and [Alias](#alias). -## Identify +These methods correspond with those used in the [Segment Spec](/docs/connections/spec/). The documentation on this page explains how to use these methods in Analytics Node.js Next. -> note "" -> **Good to know**: For any of the different methods described on this page, you can replace the properties and traits in the code samples with variables that represent the data collected. + +### Identify + +> info "Good to know" +> For any of the different methods described on this page, you can replace the properties and traits in the code samples with variables that represent the data collected. 
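For example, here's a minimal sketch of that pattern; the `user` object and its fields are hypothetical stand-ins for whatever your application already has in scope:

```javascript
// Hypothetical user record loaded from your own database or request context.
const user = {
  id: '019mr8mf4r',
  email: 'mbolton@example.com',
  plan: 'Enterprise'
}

// Pass the traits as variables rather than hard-coded values.
analytics.identify({
  userId: user.id,
  traits: {
    email: user.email,
    plan: user.plan
  }
})
```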
`identify` lets you tie a user to their actions and record traits about them. It includes a unique User ID and/or anonymous ID, and any optional traits you know about them. @@ -87,32 +92,17 @@ The call above identifies Michael by his unique User ID (the one you know him by The `identify` call has the following fields: - - - - - - - - - - - - - - - - - - - - - -
`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any identify call._
`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: You must include at least one of `userId` or `anonymousId` in all identify calls._
`traits` _Object, optional_ | A dictionary of [traits](/docs/connections/spec/identify#traits) you know about the user. Things like: `email`, `name` or `friends`.
`timestamp` _Date, optional_ | A JavaScript date object representing when the identify took place. If the identify just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`.
`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._
- -Find details on the **identify method payload** in our [Spec](/docs/connections/spec/identify/). - -## Track +Field | Details +----- | ------- +`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any identify call._ +`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: You must include at least one of `userId` or `anonymousId` in all identify calls._ +`traits` _Object, optional_ | A dictionary of [traits](/docs/connections/spec/identify#traits) you know about the user. Things like: `email`, `name` or `friends`. +`timestamp` _Date, optional_ | A JavaScript date object representing when the identify took place. If the identify just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`. +`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ + +Find details on the **identify method payload** in Segment's [Spec](/docs/connections/spec/identify/). + +### Track `track` lets you record the actions your users perform. Every action triggers what we call an "event", which can also have associated properties. @@ -152,36 +142,18 @@ This example `track` call tells us that your user just triggered the **Item Purc The `track` call has the following fields: - - - - - - - - - - - - - - - - - - - - - - - - - -
`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any track call._
`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: You must include at least one of `userId` or `anonymousId` in all track calls._
`event` _String_ | The name of the event you're tracking. We recommend human-readable names like `Song Played` or `Status Updated`.
`properties` _Object, optional_ | A dictionary of properties for the event. If the event was `Product Added`, it might have properties like `price` or `product`.
`timestamp` _Date, optional_ | A JavaScript date object representing when the track took place. If the track just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`.
`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._
- -Find details on **best practices in event naming** as well as the **`track` method payload** in our [Spec](/docs/connections/spec/track/). - -## Page +Field | Details +----- | -------- +`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any track call. +`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: You must include at least one of `userId` or `anonymousId` in all track calls._ +`event` _String_ | The name of the event you're tracking. We recommend human-readable names like `Song Played` or `Status Updated`. +`properties` _Object, optional_ | A dictionary of properties for the event. If the event was `Product Added`, it might have properties like `price` or `product`. +`timestamp` _Date, optional_ | A JavaScript date object representing when the track took place. If the track just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`. +`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ + +Find details on **best practices in event naming** as well as the **`track` method payload** in the Segment [Spec](/docs/connections/spec/track/). + +### Page The [`page`](/docs/connections/spec/page/) method lets you record page views on your website, along with optional extra information about the page being viewed. @@ -205,40 +177,19 @@ analytics.page({ The `page` call has the following fields: - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any page call._
`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (eg., [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: at least one of `userId` or `anonymousId` must be included in any page call._
`category` _String, optional_ | The category of the page. Useful for things like ecommerce where many pages often live under a larger category.
`name` _String, optional_ | The name of the page, for example **Signup** or **Home**.
`properties` _Object, optional_ | A dictionary of properties of the page. A few properties specially recognized and automatically translated: `url`, `title`, `referrer` and `path`, but you can add your own too!
`timestamp` _Date, optional_ | A JavaScript date object representing when the track took place. If the track just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`.
`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._
- -Find details on the **`page` payload** in our [Spec](/docs/connections/spec/page/). - -## Group +Field | Details +----- | -------- +`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any page call. +`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (eg., [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: at least one of `userId` or `anonymousId` must be included in any page call._ +`category` _String, optional_ | The category of the page. Useful for things like ecommerce where many pages often live under a larger category. +`name` _String, optional_ | The name of the page, for example **Signup** or **Home**. +`properties` _Object, optional_ | A dictionary of properties of the page. A few properties specially recognized and automatically translated: `url`, `title`, `referrer` and `path`, but you can add your own too. +`timestamp` _Date, optional_ | A JavaScript date object representing when the track took place. If the track just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`. +`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ + +Find details on the **`page` payload** in the Segment [Spec](/docs/connections/spec/page/). + +### Group `group` lets you associate an [identified user](/docs/connections/sources/catalog/libraries/server/node/#identify) with a group. A group could be a company, organization, account, project or team! It also lets you record custom traits about the group, like industry or number of employees. @@ -259,40 +210,19 @@ analytics.group({ The `group` call has the following fields: - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any group call._
`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (eg., [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: at least one of `userId` or `anonymousId` must be included in any group call._
`groupId` _string_ | The ID of the group.
`traits` _dict, optional_ | A dict of traits you know about the group. For a company, they might be things like `name`, `address`, or `phone`.
`context` _dict, optional_ | A dict containing any context about the request. To see the full reference of supported keys, check them out in the [context reference](/docs/connections/spec/common/#context)
`timestamp` _datetime, optional_ | A `datetime` object representing when the group took place. If the group just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you send `timestamp`.
`integrations` _dict, optional_ | A dictionary of destinations to enable or disable
- -Find more details about `group`, including the **`group` payload**, in our [Spec](/docs/connections/spec/group/). - -## Alias +Field | Details +----- | -------- +`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any group call. +`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (eg., [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: at least one of `userId` or `anonymousId` must be included in any group call._ +`groupId` _string | The ID of the group. +`traits` _dict, optional_ | A dict of traits you know about the group. For a company, they might be things like `name`, `address`, or `phone`. [Learn more about traits](/docs/connections/spec/group/#traits). +`context` _dict, optional_ | A dict containing any context about the request. To see the full reference of supported keys, check them out in the [context reference](/docs/connections/spec/common/#context) +`timestamp` _datetime, optional_ | A `datetime` object representing when the group took place. If the group just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you send `timestamp`. +`integrations` _dict, optional_ | A dictionary of destinations to enable or disable. + +Find more details about `group`, including the **`group` payload**, in the Segment [Spec](/docs/connections/spec/group/). + +### Alias The `alias` call allows you to associate one identity with another. This is an advanced method and should not be widely used, but is required to manage user identities in _some_ destinations. Other destinations do not support the alias call. @@ -309,18 +239,12 @@ analytics.alias({ The `alias` call has the following fields: - - - - - - - - - -
`userId` _String_ | The ID for this user in your database.
`previousId` _String_ | The previous ID to alias from.
+Field | Details +----- | -------- +`userId` _String_ | The ID for this user in your database. +`previousId` _String_ | The previous ID to alias from. -Here's a full example of how we might use the `alias` call: +Here's a full example of how Segment might use the `alias` call: ```javascript // the anonymous user does actions ... @@ -333,71 +257,207 @@ analytics.identify({ userId: 'identified@example.com', traits: { plan: 'Free' } analytics.track({ userId: 'identified@example.com', event: 'Identified Action' }) ``` -For more details about `alias`, including the **`alias` call payload**, check out our [Spec](/docs/connections/spec/alias/). +For more details about `alias`, including the **`alias` call payload**, check out the Segment [Spec](/docs/connections/spec/alias/). --- ## Configuration -The second argument to the `Analytics` constructor is an optional dictionary of settings to configure the module. +The second argument to the `Analytics` constructor is an optional list of settings to configure the module. ```javascript -var analytics = new Analytics('YOUR_WRITE_KEY', { - flushAt: 20, - flushInterval: 10000, - enable: false -}); +const analytics = new Analytics({ + writeKey: '', + host: 'https://api.segment.io', + path: '/v1/batch', + maxRetries: 3, + maxEventsInBatch: 15, + flushInterval: 10000, + // ... and more! + }) ``` - - - - - - - - - - - - - -
`flushAt` _Number_ | The number of messages to enqueue before flushing.
`flushInterval` _Number_ | The number of milliseconds to wait before flushing the queue automatically.
`enable` _Boolean_ | Enable (default) or disable flush. Useful when writing tests and you do not want to send data to Segment Servers.
- -### Error Handling - -Additionally there is an optional `errorHandler` property available to the class constructor's options. -If unspecified, the behaviour of the library does not change. -If specified, when an axios request fails, `errorHandler(axiosError)` will be called instead of re-throwing the axios error. - -Example usage: +Setting | Details +------- | -------- +`writeKey` _string_ | The key that corresponds to your Segment.io project +`host` _string_ | The base URL of the API. The default is: "https://api.segment.io" +`path` _string_ | The API path route. The default is: "/v1/batch" +`maxRetries` _number_ | The number of times to retry flushing a batch. The default is: `3` +`maxEventsInBatch` _number_ | The number of messages to enqueue before flushing. The default is: `15` +`flushInterval` _number_ | The number of milliseconds to wait before flushing the queue automatically. The default is: `10000` +`httpRequestTimeout` | The maximum number of milliseconds to wait for an http request. The default is: `10000` + +### Graceful shutdown +Avoid losing events after shutting down your console. Call `.closeAndFlush()` to stop collecting new events and flush all existing events. If a callback on an event call is included, this also waits for all callbacks to be called, and any of their subsequent promises to be resolved. + ```javascript -const Analytics = require('analytics-node'); +await analytics.closeAndFlush() +// or +await analytics.closeAndFlush({ timeout: 5000 }) // force resolve after 5000ms +``` -const client = new Analytics('write key', { - errorHandler: (err) => { - console.error('analytics-node flush failed.') - console.error(err) - } -}); +Here's an example of how to use graceful shutdown: +```javascript +const app = express() +const server = app.listen(3000) + +const onExit = async () => { + await analytics.closeAndFlush() + server.close(() => { + console.log("Gracefully closing server...") + process.exit() + }) +} +['SIGINT', 'SIGTERM'].forEach((code) => process.on(code, onExit)) +``` + +### Collect unflushed events +If you need to preserve all of your events in the instance of a forced timeout, even ones that came in after analytics.closeAndFlush() was called, you can still collect those events by using: + +```javascript +const unflushedEvents = [] + +analytics.on('call_after_close', (event) => unflushedEvents.push(events)) +await analytics.closeAndFlush() + +console.log(unflushedEvents) // all events that came in after closeAndFlush was called +``` + +## Regional configuration +For Business plans with access to [Regional Segment](/docs/guides/regional-segment), you can use the `host` configuration parameter to send data to the desired region: +1. Oregon (Default) — `api.segment.io/v1` +2. Dublin — `events.eu1.segmentapis.com` -client.track({ - event: 'event name', - userId: 'user id' +An example of setting the host to the EU endpoint using the Node library is: +```javascript +const analytics = new Analytics({ + ... + host: "https://events.eu1.segmentapis.com" }); +``` + +## Error handling + +To keep track of errors, subscribe and log all event delivery errors by running: +```javascript +const analytics = new Analytics({ writeKey: '' }) + +analytics.on('error', (err) => console.error(err)) ``` -If this fails when flushed no exception will be thrown, instead the axios error will be logged to the console. 
-## Development -You can use this initialization during development to make the library flush every time a message is submitted, so that you can be sure your calls are working properly before pushing to production. +### Event emitter interface +The event emitter interface allows you to track when certain things happen in the app, such as a track call or an error, and it will call the function you provided with some arguments when that event happens. ```javascript -var analytics = new Analytics('YOUR_WRITE_KEY', { flushAt: 1 }); +analytics.on('error', (err) => console.error(err)) + +analytics.on('identify', (ctx) => console.log(ctx)) + +analytics.on('track', (ctx) => console.log(ctx)) +``` + +## Plugin architecture +When you develop against Analytics 2.0, the plugins you write can augment functionality, enrich data, and control the flow and delivery of events. From modifying event payloads to changing analytics functionality, plugins help to speed up the process of getting things done. + +Though middlewares function the same as plugins, it's best to use plugins as they are easier to implement and are more testable. + +### Plugin categories +Plugins are bound by Analytics 2.0 which handles operations such as observability, retries, and error handling. There are two different categories of plugins: +* **Critical Plugins**: Analytics.js expects this plugin to be loaded before starting event delivery. Failure to load a critical plugin halts event delivery. Use this category sparingly, and only for plugins that are critical to your tracking. +* **Non-critical Plugins**: Analytics.js can start event delivery before this plugin finishes loading. This means your plugin can fail to load independently from all other plugins. For example, every Analytics.js destination is a non-critical plugin. This makes it possible for Analytics.js to continue working if a partner destination fails to load, or if users have ad blockers turned on that are targeting specific destinations. + +> info "" +> Non-critical plugins are only non-critical from a loading standpoint. For example, if the `before` plugin crashes, this can still halt the event delivery pipeline. + +Non-critical plugins run through a timeline that executes in order of insertion based on the entry type. Segment has these five entry types of non-critical plugins: + +| Type | Details | +| ------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | +| `before` | Executes before event processing begins. These are plugins that run before any other plugins run.

For example, validating events before passing them along to other plugins. A failure here could halt the event pipeline.

See the example of how Analytics.js uses the [Event Validation plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/validation/index.ts){:target="_blank"} to verify that every event has the correct shape. | +| `enrichment` | Executes as the first level of event processing. These plugins modify an event.

See the example of how Analytics.js uses the [Page Enrichment plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/page-enrichment/index.ts){:target="_blank"} to enrich every event with page information. | +| `destination` | Executes as events begin to pass off to destinations.

This doesn't modify the event outside of the specific destination, and failure doesn't halt the execution. | +| `after` | Executes after all event processing completes. You can use this to perform cleanup operations.

An example of this is the [Segment.io Plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/segmentio/index.ts){:target="_blank"} which waits for destinations to succeed or fail so it can send it observability metrics. | +| `utility` | Executes once during the bootstrap, to give you an outlet to make any modifications as to how Analytics.js works internally. This allows you to augment Analytics.js functionality. | + +### Example plugins +Here's an example of a plugin that converts all track event names to lowercase before the event goes through the rest of the pipeline: + +```js +export const lowercase: Plugin = { + name: 'Lowercase events', + type: 'enrichment', + version: '1.0.0', + + isLoaded: () => true, + load: () => Promise.resolve(), + + track: (ctx) => { + ctx.updateEvent('event', ctx.event.event.toLowerCase()) + return ctx + } +} + +const identityStitching = () => { + let user + + const identity = { + // Identifies your plugin in the Plugins stack. + // Access `window.analytics.queue.plugins` to see the full list of plugins + name: 'Identity Stitching', + // Defines where in the event timeline a plugin should run + type: 'enrichment', + version: '0.1.0', + + // Used to signal that a plugin has been property loaded + isLoaded: () => user !== undefined, + + // Applies the plugin code to every `identify` call in Analytics.js + // You can override any of the existing types in the Segment Spec. + async identify(ctx) { + // Request some extra info to enrich your `identify` events from + // an external API. + const req = await fetch( + `https://jsonplaceholder.typicode.com/users/${ctx.event.userId}` + ) + const userReq = await req.json() + + // ctx.updateEvent can be used to update deeply nested properties + // in your events. It's a safe way to change events as it'll + // create any missing objects and properties you may require. + ctx.updateEvent('traits.custom', userReq) + user.traits(userReq) + + // Every plugin must return a `ctx` object, so that the event + // timeline can continue processing. + return ctx + }, + } + + return identity +} + ``` +You can view Segment's [existing plugins](https://github.com/segmentio/analytics-next/tree/master/src/plugins){:target="_blank"} to see more examples. + +### Register a plugin +Registering plugins enable you to modify your analytics implementation to best fit your needs. You can register a plugin using this: + +```js +// A promise will resolve once the plugins have been successfully loaded into Analytics.js +// Register multiple plugins at once by using the variable args interface in Analytics.js +await analytics.register(pluginA, pluginB, pluginC) +``` + +### Deregister a plugin +Deregister a plugin by using: + +```js +await analytics.dereigsrer("pluginNameA", "pluginNameB") // takes strings +``` ## Selecting Destinations @@ -417,9 +477,9 @@ analytics.track({ }) ``` -In this case, we're specifying that we want this `track` to only go to Vero. `All: false` says that no destination should be enabled unless otherwise specified. `Vero: true` turns on Vero, etc. +In this case, Segment specifies that they want this `track` to only go to Vero. `All: false` says that no destination should be enabled unless otherwise specified. `Vero: true` turns on Vero. -Destination flags are **case sensitive** and match [the destination's name in the docs](/docs/connections/destinations/) (i.e. "AdLearn Open Platform", "awe.sm", "MailChimp", etc.). 
In some cases, there may be several names for a destination; if that happens you'll see a "Adding (destination name) to the Integrations Object" section in the destination's doc page with a list of valid names. +Destination flags are **case sensitive** and match [the destination's name in the docs](/docs/connections/destinations/) (for example, "AdLearn Open Platform", "awe.sm", "MailChimp"). In some cases, there may be several names for a destination; if that happens you'll see a "Adding (destination name) to the Integrations Object" section in the destination's doc page with a list of valid names. **Note:** @@ -433,136 +493,51 @@ You can import historical data by adding the `timestamp` argument to any of your Historical imports can only be done into destinations that can accept historical timestamped data. Most analytics tools like Mixpanel, Amplitude, Kissmetrics, etc. can handle that type of data just fine. One common destination that does not accept historical data is Google Analytics since their API cannot accept historical data. -**Note:** If you're tracking things that are happening right now, leave out the `timestamp` and our servers will timestamp the requests for you. +**Note:** If you're tracking things that are happening right now, leave out the `timestamp` and Segment's servers will timestamp the requests for you. ## Batching -Our libraries are built to support high performance environments. That means it is safe to use our Node library on a web server that's serving hundreds of requests per second. +Segment's libraries are built to support high performance environments. That means it is safe to use Segment's Node library on a web server that's serving hundreds of requests per second. -Every method you call **does not** result in an HTTP request, but is queued in memory instead. Messages are then flushed in batch in the background, which allows for much faster operation. +Every method you call **doesn't** result in a HTTP request, but is queued in memory instead. Messages are then flushed in batch in the background, which allows for much faster operation. -By default, our library will flush: +By default, Segment's library will flush: - The very first time it gets a message. - - Every 20 messages (controlled by `options.flushAt`). - - If 10 seconds has passed since the last flush (controlled by `options.flushInterval`) + - Every 15 messages (controlled by `settings.maxEventsInBatch`). + - If 10 seconds has passed since the last flush (controlled by `settings.flushInterval`) There is a maximum of `500KB` per batch request and `32KB` per call. -If you don't want to batch messages, you can turn batching off by setting the `flushAt` option to `1`, like so: +If you don't want to batch messages, you can turn batching off by setting the `maxEventsInBatch` setting to `1`, like so: ```javascript -var analytics = new Analytics('YOUR_WRITE_KEY', { flushAt: 1 }); -``` +const analytics = new Analytics({ + ... + maxEventsInBatch: 1 +});``` -Batching means that your message might not get sent right away. But every method call takes an optional `callback`, which you can use to know when a particular message is flushed from the queue, like so: +Batching means that your message might not get sent right away. 
Every method call takes an optional `callback`, which you can use to know when a particular message is flushed from the queue, like so: ```javascript analytics.track({ - userId: '019mr8mf4r', - event: 'Ultimate Played' -}, function(err, batch){ - if (err) // There was an error flushing your message... - // Your message was successfully flushed! -}); -``` - -You can also flush on demand. For example, at the end of your program, you need to flush to make sure that nothing is left in the queue. To do that, call the `flush` method: - -```javascript -analytics.flush(function(err, batch){ - console.log('Flushed, and now this program can exit!'); -}); -``` - -## Long running process - -You should call `client.track(...)` and know that events will be queued and eventually sent to Segment. To prevent losing messages, be sure to capture any interruption (for example, a server restart) and call flush to know of and delay the process shutdown. - -```js -import { randomUUID } from 'crypto'; -import Analytics from 'analytics-node' - -const WRITE_KEY = '...'; - -const analytics = new Analytics(WRITE_KEY, { flushAt: 10 }); - -analytics.track({ - anonymousId: randomUUID(), - event: 'Test event', - properties: { - name: 'Test event', - timestamp: new Date() + userId: '019mr8mf4r', + event: 'Ultimate Played', + }, + (err, ctx) => { + ... } -}); - -const exitGracefully = async (code) => { - console.log('Flushing events'); - await analytics.flush(function(err, batch) { - console.log('Flushed, and now this program can exit!'); - process.exit(code); - }); -}; - -[ - 'beforeExit', 'uncaughtException', 'unhandledRejection', - 'SIGHUP', 'SIGINT', 'SIGQUIT', 'SIGILL', 'SIGTRAP', - 'SIGABRT','SIGBUS', 'SIGFPE', 'SIGUSR1', 'SIGSEGV', - 'SIGUSR2', 'SIGTERM', -].forEach(evt => process.on(evt, exitGracefully)); - -function logEvery2Seconds(i) { - setTimeout(() => { - console.log('Infinite Loop Test n:', i); - logEvery2Seconds(++i); - }, 2000); -} - -logEvery2Seconds(0); +) ``` -## Short lived process - -Short-lived functions have a predictably short and linear lifecycle, so use a queue big enough to hold all messages and then await flush to complete its work. - - -```js -import { randomUUID } from 'crypto'; -import Analytics from 'analytics-node' - - -async function lambda() -{ - const WRITE_KEY = '...'; - const analytics = new Analytics(WRITE_KEY, { flushAt: 20 }); - analytics.flushed = true; - - analytics.track({ - anonymousId: randomUUID(), - event: 'Test event', - properties: { - name: 'Test event', - timestamp: new Date() - } - }); - await analytics.flush(function(err, batch) { - console.log('Flushed, and now this program can exit!'); - }); -} - -lambda(); -``` - - ## Multiple Clients Different parts of your application may require different types of batching, or even sending to multiple Segment sources. 
In that case, you can initialize multiple instances of `Analytics` with different settings: ```javascript -var Analytics = require('analytics-node'); -var marketingAnalytics = new Analytics('MARKETING_WRITE_KEY'); -var appAnalytics = new Analytics('APP_WRITE_KEY'); +const marketingAnalytics = new Analytics({ writeKey: 'MARKETING_WRITE_KEY' }); +const appAnalytics = new Analytics({ writeKey: 'APP_WRITE_KEY' }); ``` diff --git a/src/connections/sources/catalog/libraries/server/node/next.md b/src/connections/sources/catalog/libraries/server/node/next.md deleted file mode 100644 index 62b760a6d2..0000000000 --- a/src/connections/sources/catalog/libraries/server/node/next.md +++ /dev/null @@ -1,625 +0,0 @@ ---- -title: Analytics for Node.js 2.0 -repo: analytics-node -strat: node-js ---- - -Segment's Analytics Node.js 2.0 library lets you record analytics data from your node code. The requests hit Segment's servers, and then Segment routes your data to any destinations you have enabled. - -The [Segment Analytics Node.js Next library is open-source](https://github.com/segmentio/analytics-next/tree/master/packages/node){:target="_blank"} on GitHub. - -All of Segment's server-side libraries are built for high-performance, so you can use them in your web server controller code. This library uses an internal queue to make `identify` and `track` calls non-blocking and fast. It also batches messages and flushes asynchronously to Segment's servers. - -> info "Using Analytics for Node.js Classic?" -> If you’re still using the classic version of Analytics for Node.js, you can refer to the documentation [here](/docs/connections/sources/catalog/libraries/server/node/). ->

On [date], Segment will end support for Analytics Node.js Classic, which includes versions [#] and older. Upgrade to Analytics Node.js 2.0. See the Analytics Node.js 2.0 docs to learn more. - -## Getting Started - -> warning "" -> Make sure you're using a version of Node that's 14 or higher. - -Run: - -```bash -# npm -npm install @segment/analytics-node -# yarn -yarn add @segment/analytics-node -# pnpm -pnpm install @segment/analytics-node -``` - -This will add Segment's Node library module to your `package.json`. The module exposes an `Analytics` constructor, which you need to initialize with your Segment source's **Write Key**, like so: - -```javascript -var Analytics = require('analytics-node'); -var analytics = new Analytics('YOUR_WRITE_KEY'); -``` - -Be sure to replace `YOUR_WRITE_KEY` with your actual **Write Key** which you can find in Segment under your source settings. - -This will create an instance of `Analytics` that you can use to send data to Segment for your project. The default initialization settings are production-ready and queue 20 messages before sending any requests. In development you might want to use [development settings](/docs/connections/sources/catalog/libraries/server/node/#development). - - -## Basic tracking methods -The basic tracking methods below serve as the building blocks of your Segment tracking. They include [Identify](#identify), [Track](#track), [Page](#page), [Group](#group), and [Alias](#alias). - -These methods correspond with those used in the [Segment Spec](/docs/connections/spec/). The documentation on this page explains how to use these methods in Analytics Node.js Next. - - -### Identify - -> info "Good to know" -> For any of the different methods described on this page, you can replace the properties and traits in the code samples with variables that represent the data collected. - -`identify` lets you tie a user to their actions and record traits about them. It includes a unique User ID and/or anonymous ID, and any optional traits you know about them. - -You should call `identify` once when the user's account is first created, and then again any time their traits change. - -Example of an anonymous `identify` call: - -```javascript -analytics.identify({ - anonymousId: '48d213bb-95c3-4f8d-af97-86b2b404dcfe', - traits: { - friends: 42 - } -}); -``` - -This call identifies the user and records their unique anonymous ID, and labels them with the `friends` trait. - -Example of an `identify` call for an identified user: - -```javascript -analytics.identify({ - userId: '019mr8mf4r', - traits: { - name: 'Michael Bolton', - email: 'mbolton@example.com', - plan: 'Enterprise', - friends: 42 - } -}); -``` -The call above identifies Michael by his unique User ID (the one you know him by in your database), and labels him with the `name`, `email`, `plan` and `friends` traits. - -The `identify` call has the following fields: - - - - - - - - - - - - - - - - - - - - - - -
`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any identify call._
`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: You must include at least one of `userId` or `anonymousId` in all identify calls._
`traits` _Object, optional_ | A dictionary of [traits](/docs/connections/spec/identify#traits) you know about the user. Things like: `email`, `name` or `friends`.
`timestamp` _Date, optional_ | A JavaScript date object representing when the identify took place. If the identify just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`.
`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._
- -Find details on the **identify method payload** in our [Spec](/docs/connections/spec/identify/). - -### Track - -`track` lets you record the actions your users perform. Every action triggers what we call an "event", which can also have associated properties. - -You'll want to track events that are indicators of success for your site, like **Signed Up**, **Item Purchased** or **Article Bookmarked**. - -To get started, we recommend tracking just a few important events. You can always add more later! - -Example anonymous `track` call: - -```javascript -analytics.track({ - anonymousId: '48d213bb-95c3-4f8d-af97-86b2b404dcfe', - event: 'Item Purchased', - properties: { - revenue: 39.95, - shippingMethod: '2-day' - } -}); -``` - -Example identified `track` call: - -```javascript -analytics.track({ - userId: '019mr8mf4r', - event: 'Item Purchased', - properties: { - revenue: 39.95, - shippingMethod: '2-day' - } -}); -``` - -This example `track` call tells us that your user just triggered the **Item Purchased** event with a revenue of $39.95 and chose your hypothetical '2-day' shipping. - -`track` event properties can be anything you want to record. In this case, revenue and shipping method. - -The `track` call has the following fields: - - - - - - - - - - - - - - - - - - - - - - - - - - -
`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any track call._
`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: You must include at least one of `userId` or `anonymousId` in all track calls._
`event` _String_ | The name of the event you're tracking. We recommend human-readable names like `Song Played` or `Status Updated`.
`properties` _Object, optional_ | A dictionary of properties for the event. If the event was `Product Added`, it might have properties like `price` or `product`.
`timestamp` _Date, optional_ | A JavaScript date object representing when the track took place. If the track just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`.
`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._
- -Find details on **best practices in event naming** as well as the **`track` method payload** in our [Spec](/docs/connections/spec/track/). - -### Page - -The [`page`](/docs/connections/spec/page/) method lets you record page views on your website, along with optional extra information about the page being viewed. - -If you're using our client-side set up in combination with the Node.js library, page calls are **already tracked for you** by default. However, if you want to record your own page views manually and aren't using our client-side library, read on! - -Example `page` call: - -```js -analytics.page({ - userId: '019mr8mf4r', - category: 'Docs', - name: 'Node.js Library', - properties: { - url: 'https://segment.com/docs/connections/sources/catalog/librariesnode', - path: '/docs/connections/sources/catalog/librariesnode/', - title: 'Node.js Library - Segment', - referrer: 'https://github.com/segmentio/analytics-node' - } -}); -``` - -The `page` call has the following fields: - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any page call._
`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (eg., [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: at least one of `userId` or `anonymousId` must be included in any page call._
`category` _String, optional_ | The category of the page. Useful for things like ecommerce where many pages often live under a larger category.
`name` _String, optional_ | The name of the page, for example **Signup** or **Home**.
`properties` _Object, optional_ | A dictionary of properties of the page. A few properties specially recognized and automatically translated: `url`, `title`, `referrer` and `path`, but you can add your own too!
`timestamp` _Date, optional_ | A JavaScript date object representing when the track took place. If the track just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`.
`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._
- -Find details on the **`page` payload** in our [Spec](/docs/connections/spec/page/). - -### Group - -`group` lets you associate an [identified user](/docs/connections/sources/catalog/libraries/server/node/#identify) with a group. A group could be a company, organization, account, project or team! It also lets you record custom traits about the group, like industry or number of employees. - -This is useful for tools like [Intercom](/docs/connections/destinations/catalog/intercom/), [Preact](/docs/connections/destinations/catalog/preact/) and [Totango](/docs/connections/destinations/catalog/totango/), as it ties the user to a **group** of other users. - -Example `group` call: - -```javascript -analytics.group({ - userId: '019mr8mf4r', - groupId: '56', - traits: { - name: 'Initech', - description: 'Accounting Software' - } -}); -``` - -The `group` call has the following fields: - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
`userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any group call._
`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (eg., [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: at least one of `userId` or `anonymousId` must be included in any group call._
`groupId` _string_ | The ID of the group.
`traits` _dict, optional_ | A dict of traits you know about the group. For a company, they might be things like `name`, `address`, or `phone`. [Learn more about traits](/docs/connections/spec/group/#traits).
`context` _dict, optional_ | A dict containing any context about the request. To see the full reference of supported keys, check them out in the [context reference](/docs/connections/spec/common/#context)
`timestamp` _datetime, optional_ | A `datetime` object representing when the group took place. If the group just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you send `timestamp`.
`integrations` _dict, optional_ | A dictionary of destinations to enable or disable

Find more details about `group`, including the **`group` payload**, in the Segment [Spec](/docs/connections/spec/group/).

### Alias

The `alias` call allows you to associate one identity with another. This is an advanced method and should not be widely used, but it is required to manage user identities in _some_ destinations. Other destinations do not support the alias call.

In [Mixpanel](/docs/connections/destinations/catalog/mixpanel/#alias) it's used to associate an anonymous user with an identified user once they sign up. For [Kissmetrics](/docs/connections/destinations/catalog/kissmetrics/#alias), if your user switches IDs, you can use `alias` to rename the `userId`.

Example `alias` call:

```javascript
analytics.alias({
  previousId: 'old_id',
  userId: 'new_id'
});
```

The `alias` call has the following fields:

Field | Details
----- | -------
`userId` _String_ | The ID for this user in your database.
`previousId` _String_ | The previous ID to alias from.

Here's a full example of how Segment might use the `alias` call:

```javascript
// the anonymous user does actions ...
analytics.track({ userId: 'anonymous_user', event: 'Anonymous Event' })
// the anonymous user signs up and is aliased
analytics.alias({ previousId: 'anonymous_user', userId: 'identified@example.com' })
// the identified user is identified
analytics.identify({ userId: 'identified@example.com', traits: { plan: 'Free' } })
// the identified user does actions ...
analytics.track({ userId: 'identified@example.com', event: 'Identified Action' })
```

For more details about `alias`, including the **`alias` call payload**, check out the Segment [Spec](/docs/connections/spec/alias/).

---

## Configuration

The `Analytics` constructor accepts a settings object that you can use to configure the module.

```javascript
const analytics = new Analytics({
    writeKey: '',
    host: 'https://api.segment.io',
    path: '/v1/batch',
    maxRetries: 3,
    maxEventsInBatch: 15,
    flushInterval: 10000,
    // ... and more!
  })
```

Setting | Details
------- | --------
`writeKey` _string_ | The key that corresponds to your Segment.io project.
`host` _string_ | The base URL of the API. The default is: "https://api.segment.io"
`path` _string_ | The API path route. The default is: "/v1/batch"
`maxRetries` _number_ | The number of times to retry flushing a batch. The default is: `3`
`maxEventsInBatch` _number_ | The number of messages to enqueue before flushing. The default is: `15`
`flushInterval` _number_ | The number of milliseconds to wait before flushing the queue automatically. The default is: `10000`
`httpRequestTimeout` _number_ | The maximum number of milliseconds to wait for an HTTP request. The default is: `10000`

### Graceful shutdown
Avoid losing events when your app shuts down. Call `.closeAndFlush()` to stop collecting new events and flush all existing events. If a callback on an event call is included, this also waits for all callbacks to be called, and any of their subsequent promises to be resolved.

```javascript
await analytics.closeAndFlush()
// or
await analytics.closeAndFlush({ timeout: 5000 }) // force resolve after 5000ms
```

Here's an example of how to use graceful shutdown:
```javascript
const app = express()
const server = app.listen(3000)

const onExit = async () => {
  await analytics.closeAndFlush()
  server.close(() => {
    console.log("Gracefully closing server...")
    process.exit()
  })
};

['SIGINT', 'SIGTERM'].forEach((code) => process.on(code, onExit))
```

### Collect unflushed events
If you need to preserve all of your events in the case of a forced timeout, even ones that came in after `analytics.closeAndFlush()` was called, you can still collect those events by using:

```javascript
const unflushedEvents = []

analytics.on('call_after_close', (event) => unflushedEvents.push(event))
await analytics.closeAndFlush()

console.log(unflushedEvents) // all events that came in after closeAndFlush was called
```

## Regional configuration
For Business plans with access to [Regional Segment](/docs/guides/regional-segment), you can use the `host` configuration parameter to send data to the desired region:
1. Oregon (Default) — `api.segment.io/v1`
2. Dublin — `events.eu1.segmentapis.com`

An example of setting the host to the EU endpoint using the Node library is:
```javascript
const analytics = new Analytics({
  ...
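  // Assumption: your workspace is set up for the EU (Dublin) region.
  // `host` points the client at the EU ingestion endpoint instead of the default US endpoint.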
  host: "https://events.eu1.segmentapis.com"
});
```

## Error handling

To keep track of errors, subscribe to and log all event delivery errors by running:

```javascript
const analytics = new Analytics({ writeKey: '' })

analytics.on('error', (err) => console.error(err))
```

### Event emitter interface
The event emitter interface lets you track when certain things happen in the app, such as a `track` call or an error. When that event happens, it calls the function you provided with the relevant arguments.

```javascript
analytics.on('error', (err) => console.error(err))

analytics.on('identify', (ctx) => console.log(ctx))

analytics.on('track', (ctx) => console.log(ctx))
```

## Plugin architecture
When you develop against Analytics.js 2.0, the plugins you write can augment functionality, enrich data, and control the flow and delivery of events. From modifying event payloads to changing analytics functionality, plugins help to speed up the process of getting things done.

Though middlewares function the same as plugins, it's best to use plugins as they are easier to implement and are more testable.

### Plugin categories
Plugins are bound by Analytics.js 2.0, which handles operations such as observability, retries, and error handling. There are two different categories of plugins:
* **Critical Plugins**: Analytics.js expects this plugin to be loaded before starting event delivery. Failure to load a critical plugin halts event delivery. Use this category sparingly, and only for plugins that are critical to your tracking.
* **Non-critical Plugins**: Analytics.js can start event delivery before this plugin finishes loading. This means your plugin can fail to load independently from all other plugins. For example, every Analytics.js destination is a non-critical plugin. This makes it possible for Analytics.js to continue working if a partner destination fails to load, or if users have ad blockers turned on that are targeting specific destinations.

> info ""
> Non-critical plugins are only non-critical from a loading standpoint. For example, if the `before` plugin crashes, this can still halt the event delivery pipeline.

Non-critical plugins run through a timeline that executes in order of insertion based on the entry type. Segment has these five entry types of non-critical plugins:

| Type          | Details |
| ------------- | ------- |
| `before`      | Executes before event processing begins. These are plugins that run before any other plugins run.<br><br>For example, validating events before passing them along to other plugins. A failure here could halt the event pipeline.<br><br>See the example of how Analytics.js uses the [Event Validation plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/validation/index.ts){:target="_blank"} to verify that every event has the correct shape. |
| `enrichment`  | Executes as the first level of event processing. These plugins modify an event.<br><br>See the example of how Analytics.js uses the [Page Enrichment plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/page-enrichment/index.ts){:target="_blank"} to enrich every event with page information. |
| `destination` | Executes as events begin to pass off to destinations.<br><br>This doesn't modify the event outside of the specific destination, and failure doesn't halt the execution. |
| `after`       | Executes after all event processing completes. You can use this to perform cleanup operations.<br><br>An example of this is the [Segment.io Plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/segmentio/index.ts){:target="_blank"} which waits for destinations to succeed or fail so it can send its observability metrics. |
| `utility`     | Executes once during the bootstrap, to give you an outlet to make any modifications as to how Analytics.js works internally. This allows you to augment Analytics.js functionality. |

### Example plugins
Here's an example of a plugin that converts all track event names to lowercase before the event goes through the rest of the pipeline:

```js
export const lowercase: Plugin = {
  name: 'Lowercase events',
  type: 'enrichment',
  version: '1.0.0',

  isLoaded: () => true,
  load: () => Promise.resolve(),

  track: (ctx) => {
    ctx.updateEvent('event', ctx.event.event.toLowerCase())
    return ctx
  }
}

const identityStitching = () => {
  let user

  const identity = {
    // Identifies your plugin in the Plugins stack.
    // Access `window.analytics.queue.plugins` to see the full list of plugins
    name: 'Identity Stitching',
    // Defines where in the event timeline a plugin should run
    type: 'enrichment',
    version: '0.1.0',

    // Used to signal that a plugin has been properly loaded
    isLoaded: () => user !== undefined,

    // Applies the plugin code to every `identify` call in Analytics.js
    // You can override any of the existing types in the Segment Spec.
    async identify(ctx) {
      // Request some extra info to enrich your `identify` events from
      // an external API.
      const req = await fetch(
        `https://jsonplaceholder.typicode.com/users/${ctx.event.userId}`
      )
      const userReq = await req.json()

      // ctx.updateEvent can be used to update deeply nested properties
      // in your events. It's a safe way to change events as it'll
      // create any missing objects and properties you may require.
      ctx.updateEvent('traits.custom', userReq)
      user.traits(userReq)

      // Every plugin must return a `ctx` object, so that the event
      // timeline can continue processing.
      return ctx
    },
  }

  return identity
}
```

You can view Segment's [existing plugins](https://github.com/segmentio/analytics-next/tree/master/src/plugins){:target="_blank"} to see more examples.

### Register a plugin
Registering plugins enables you to modify your analytics implementation to best fit your needs. You can register a plugin like this:

```js
// A promise will resolve once the plugins have been successfully loaded into Analytics.js
// You can register multiple plugins at once by using the variable args interface in Analytics.js
await analytics.register(pluginA, pluginB, pluginN)
```

## Selecting Destinations

The `alias`, `group`, `identify`, `page` and `track` calls can all be passed an object of `integrations` that lets you turn certain destinations on or off. By default all destinations are enabled.

Here's an example with the `integrations` object shown:

```javascript
analytics.track({
  event: 'Membership Upgraded',
  userId: '97234974',
  integrations: {
    'All': false,
    'Vero': true,
    'Google Analytics': false
  }
})
```

In this case, the `integrations` object specifies that this `track` call should only go to Vero. `All: false` says that no destination should be enabled unless otherwise specified, and `Vero: true` turns Vero back on.
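
The flags work in the other direction as well. Because every destination is enabled by default, you can leave `All` alone and switch off only the destinations you want to exclude. The sketch below assumes a hypothetical setup where one destination should be skipped; the destination name is only an example:

```javascript
analytics.track({
  event: 'Membership Upgraded',
  userId: '97234974',
  integrations: {
    // Every other enabled destination still receives the event...
    // ...but this one destination is skipped.
    'Google Analytics': false
  }
})
```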

Destination flags are **case sensitive** and match [the destination's name in the docs](/docs/connections/destinations/) (for example, "AdLearn Open Platform", "awe.sm", "MailChimp"). In some cases, there may be several names for a destination; if that happens, you'll see an "Adding (destination name) to the Integrations Object" section in the destination's doc page with a list of valid names.

**Note:**

- Available at the business level, filtering track calls can be done right from the Segment UI on your source schema page. Segment recommends using the UI if possible since it's a much simpler way of managing your filters and can be updated with no code changes on your side.

- If you are on a grandfathered plan, events sent server-side that are filtered through the Segment dashboard will still count towards your API usage.

## Historical Import

You can import historical data by adding the `timestamp` argument to any of your method calls. This can be helpful if you've just switched to Segment.

Historical imports can only be done into destinations that can accept historical timestamped data. Most analytics tools like Mixpanel, Amplitude, and Kissmetrics can handle that type of data just fine. One common destination that does not accept historical data is Google Analytics, since its API cannot accept historical data.

**Note:** If you're tracking things that are happening right now, leave out the `timestamp` and Segment's servers will timestamp the requests for you.


## Batching

Segment's libraries are built to support high-performance environments. That means it is safe to use Segment's Node library on a web server that's serving hundreds of requests per second.

Every method you call **doesn't** result in an HTTP request, but is queued in memory instead. Messages are then flushed in batch in the background, which allows for much faster operation.

By default, Segment's library will flush:

  - The very first time it gets a message.
  - Every 15 messages (controlled by `settings.maxEventsInBatch`).
  - If 10 seconds have passed since the last flush (controlled by `settings.flushInterval`).

There is a maximum of `500KB` per batch request and `32KB` per call.

If you don't want to batch messages, you can turn batching off by setting the `maxEventsInBatch` setting to `1`, like so:

```javascript
const analytics = new Analytics({
  ...
  maxEventsInBatch: 1
});
```

Batching means that your message might not get sent right away. Every method call takes an optional `callback`, which you can use to know when a particular message is flushed from the queue, like so:

```javascript
analytics.track({
    userId: '019mr8mf4r',
    event: 'Ultimate Played',
  },
  (err, ctx) => {
    ...
  }
)
```

You can also flush on demand. For example, at the end of your program, you need to flush to make sure that nothing is left in the queue. To do that, call the `flush` method:

```javascript
analytics.flush(function(err, batch){
  console.log('Flushed, and now this program can exit!');
});
```

## Multiple Clients

Different parts of your application may require different types of batching, or even sending to multiple Segment sources.
In that case, you can initialize multiple instances of `Analytics` with different settings: - -```javascript -const marketingAnalytics = new Analytics({ writeKey: 'MARKETING_WRITE_KEY' }); -const appAnalytics = new Analytics({ writeKey: 'APP_WRITE_KEY' }); -``` - - -## Troubleshooting - -{% include content/troubleshooting-intro.md %} -{% include content/troubleshooting-server-debugger.md %} -{% include content/troubleshooting-server-integration.md %} From ce4c6f7eb8c0cb99ec0f14c0883440618692264a Mon Sep 17 00:00:00 2001 From: stayseesong Date: Mon, 9 Jan 2023 10:54:12 -0800 Subject: [PATCH 06/15] edits --- .../catalog/libraries/server/node/classic.md | 44 +++++++++---------- .../catalog/libraries/server/node/index.md | 2 +- .../libraries/server/node/quickstart.md | 23 +++++++--- 3 files changed, 39 insertions(+), 30 deletions(-) diff --git a/src/connections/sources/catalog/libraries/server/node/classic.md b/src/connections/sources/catalog/libraries/server/node/classic.md index bbe4075540..4c63764887 100644 --- a/src/connections/sources/catalog/libraries/server/node/classic.md +++ b/src/connections/sources/catalog/libraries/server/node/classic.md @@ -6,13 +6,13 @@ hidden: true --- > warning "Deprecation of Analytics Node.js Classic" -> On [date], Segment will end support for Analytics Node.js Classic, which includes versions [1.x.x] and older. Upgrade to the new version of Analytics Node.js. See the updated [Analytics Node.js] docs to learn more. +> On April 1, 2023, Segment will end support for Analytics Node.js Classic, which includes versions 6.2.0 and older. Upgrade to the new version of Analytics Node.js. See the updated [Analytics Node.js docs](/docs/connections/sources/catalog/libraries/server/node) to learn more. Segment's Node.js library lets you record analytics data from your node code. The requests hit Segment's servers, and then Segment routes your data to any destinations you have enabled. The [Segment Node.js library is open-source](https://github.com/segmentio/analytics-node) on GitHub. -All of Segment's server-side libraries are built for high-performance, so you can use them in your web server controller code. This library uses an internal queue to make `identify` and `track` calls non-blocking and fast. It also batches messages and flushes asynchronously to our servers. +All of Segment's server-side libraries are built for high-performance, so you can use them in your web server controller code. This library uses an internal queue to make `identify` and `track` calls non-blocking and fast. It also batches messages and flushes asynchronously to Segment's servers. Want to stay updated on releases? Subscribe to the [release feed](https://github.com/segmentio/analytics-node/releases.atom). @@ -24,7 +24,7 @@ Run: npm install --save analytics-node ``` -This will add our Node library module to your `package.json`. The module exposes an `Analytics` constructor, which you need to initialize with your Segment source's **Write Key**, like so: +This will add Segment's Node library module to your `package.json`. The module exposes an `Analytics` constructor, which you need to initialize with your Segment source's **Write Key**, like so: ```javascript var Analytics = require('analytics-node'); @@ -91,18 +91,18 @@ Field | Details `userId` _String, optional_ | The ID for this user in your database. 
_Note: at least one of `userId` or `anonymousId` must be included in any identify call._ `anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: You must include at least one of `userId` or `anonymousId` in all identify calls._ `traits` _Object, optional_ | A dictionary of [traits](/docs/connections/spec/identify#traits) you know about the user. Things like: `email`, `name` or `friends`. -`timestamp` _Date, optional_ | A JavaScript date object representing when the identify took place. If the identify just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`. +`timestamp` _Date, optional_ | A JavaScript date object representing when the identify call took place. If the identify just happened, leave it out as Segment will use the server's time. If you're importing data from the past make sure you to send a `timestamp`. `context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ Find details on the **identify method payload** in the Segment [Spec](/docs/connections/spec/identify/). ## Track -`track` lets you record the actions your users perform. Every action triggers what we call an "event", which can also have associated properties. +`track` lets you record the actions your users perform. Every action triggers what Segment calls an "event", which can also have associated properties. You'll want to track events that are indicators of success for your site, like **Signed Up**, **Item Purchased** or **Article Bookmarked**. -To get started, we recommend tracking just a few important events. You can always add more later! +To get started, Segment recommends tracking just a few important events. You can always add more later. Example anonymous `track` call: @@ -130,7 +130,7 @@ analytics.track({ }); ``` -This example `track` call tells us that your user just triggered the **Item Purchased** event with a revenue of $39.95 and chose your hypothetical '2-day' shipping. +This example `track` call tells that your user just triggered the **Item Purchased** event with a revenue of $39.95 and chose your hypothetical '2-day' shipping. `track` event properties can be anything you want to record. In this case, revenue and shipping method. @@ -140,9 +140,9 @@ Field | Details ----- | -------- `userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any track call. `anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: You must include at least one of `userId` or `anonymousId` in all track calls._ -`event` _String_ | The name of the event you're tracking. We recommend human-readable names like `Song Played` or `Status Updated`. +`event` _String_ | The name of the event you're tracking. Segment recommends you use human-readable names like `Song Played` or `Status Updated`. `properties` _Object, optional_ | A dictionary of properties for the event. 
If the event was `Product Added`, it might have properties like `price` or `product`. -`timestamp` _Date, optional_ | A JavaScript date object representing when the track took place. If the track just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`. +`timestamp` _Date, optional_ | A JavaScript date object representing when the track took place. If the track just happened, leave it out as Segment will use the server's time. If you're importing data from the past make sure you to send a `timestamp`. `context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ Find details on **best practices in event naming** as well as the **`track` method payload** in the Segment [Spec](/docs/connections/spec/track/). @@ -151,7 +151,7 @@ Find details on **best practices in event naming** as well as the **`track` meth The [`page`](/docs/connections/spec/page/) method lets you record page views on your website, along with optional extra information about the page being viewed. -If you're using our client-side set up in combination with the Node.js library, page calls are **already tracked for you** by default. However, if you want to record your own page views manually and aren't using our client-side library, read on! +If you're using Segment's client-side set up in combination with the Node.js library, page calls are already tracked for you by default. Example `page` call: @@ -174,11 +174,11 @@ The `page` call has the following fields: Field | Details ----- | -------- `userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any page call. -`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (eg., [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: at least one of `userId` or `anonymousId` must be included in any page call._ +`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: at least one of `userId` or `anonymousId` must be included in any page call._ `category` _String, optional_ | The category of the page. Useful for things like ecommerce where many pages often live under a larger category. `name` _String, optional_ | The name of the page, for example **Signup** or **Home**. `properties` _Object, optional_ | A dictionary of properties of the page. A few properties specially recognized and automatically translated: `url`, `title`, `referrer` and `path`, but you can add your own too. -`timestamp` _Date, optional_ | A JavaScript date object representing when the track took place. If the track just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`. +`timestamp` _Date, optional_ | A JavaScript date object representing when the track took place. If the track just happened, leave it out as Segment will use the server's time. If you're importing data from the past make sure you to send a `timestamp`. 
`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ Find details on the **`page` payload** in the Segment [Spec](/docs/connections/spec/page/). @@ -207,11 +207,11 @@ The `group` call has the following fields: Field | Details ----- | -------- `userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any group call. -`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (eg., [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: at least one of `userId` or `anonymousId` must be included in any group call._ +`anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: at least one of `userId` or `anonymousId` must be included in any group call._ `groupId` _string | The ID of the group. `traits` _dict, optional_ | A dict of traits you know about the group. For a company, they might be things like `name`, `address`, or `phone`. [Learn more about traits](/docs/connections/spec/group/#traits). `context` _dict, optional_ | A dict containing any context about the request. To see the full reference of supported keys, check them out in the [context reference](/docs/connections/spec/common/#context) -`timestamp` _datetime, optional_ | A `datetime` object representing when the group took place. If the group just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you send `timestamp`. +`timestamp` _datetime, optional_ | A `datetime` object representing when the group took place. If the group just happened, leave it out as Segment uses the server's time. If you're importing data from the past make sure you send `timestamp`. `integrations` _dict, optional_ | A dictionary of destinations to enable or disable. Find more details about `group`, including the **`group` payload**, in the Segment [Spec](/docs/connections/spec/group/). @@ -238,7 +238,7 @@ Field | Details `userId` _String_ | The ID for this user in your database. `previousId` _String_ | The previous ID to alias from. -Here's a full example of how we might use the `alias` call: +Here's a full example of how Segment might use the `alias` call: ```javascript // the anonymous user does actions ... @@ -251,7 +251,7 @@ analytics.identify({ userId: 'identified@example.com', traits: { plan: 'Free' } analytics.track({ userId: 'identified@example.com', event: 'Identified Action' }) ``` -For more details about `alias`, including the **`alias` call payload**, check out our [Spec](/docs/connections/spec/alias/). +For more details about `alias`, including the **`alias` call payload**, check out the [Spec](/docs/connections/spec/alias/). --- @@ -327,13 +327,13 @@ analytics.track({ }) ``` -In this case, we're specifying that we want this `track` to only go to Vero. `All: false` says that no destination should be enabled unless otherwise specified. `Vero: true` turns on Vero, etc. +In this case, Segment specifies the `track` to only go to Vero. `All: false` says that no destination should be enabled unless otherwise specified. 
`Vero: true` turns on Vero, etc. -Destination flags are **case sensitive** and match [the destination's name in the docs](/docs/connections/destinations/) (i.e. "AdLearn Open Platform", "awe.sm", "MailChimp", etc.). In some cases, there may be several names for a destination; if that happens you'll see a "Adding (destination name) to the Integrations Object" section in the destination's doc page with a list of valid names. +Destination flags are **case sensitive** and match [the destination's name in the docs](/docs/connections/destinations/) (for example, "AdLearn Open Platform", "awe.sm", "MailChimp"). In some cases, there may be several names for a destination; if that happens you'll see a "Adding (destination name) to the Integrations Object" section in the destination's doc page with a list of valid names. **Note:** -- Available at the business level, filtering track calls can be done right from the Segment UI on your source schema page. We recommend using the UI if possible since it's a much simpler way of managing your filters and can be updated with no code changes on your side. +- Available at the business level, filtering track calls can be done right from the Segment UI on your source schema page. Segment recommends using the UI if possible since it's a much simpler way of managing your filters and can be updated with no code changes on your side. - If you are on a grandfathered plan, events sent server-side that are filtered through the Segment dashboard will still count towards your API usage. @@ -343,16 +343,16 @@ You can import historical data by adding the `timestamp` argument to any of your Historical imports can only be done into destinations that can accept historical timestamped data. Most analytics tools like Mixpanel, Amplitude, Kissmetrics, etc. can handle that type of data just fine. One common destination that does not accept historical data is Google Analytics since their API cannot accept historical data. -**Note:** If you're tracking things that are happening right now, leave out the `timestamp` and our servers will timestamp the requests for you. +**Note:** If you're tracking things that are happening right now, leave out the `timestamp` and Segment's servers will timestamp the requests for you. ## Batching -Our libraries are built to support high performance environments. That means it is safe to use our Node library on a web server that's serving hundreds of requests per second. +Segment's libraries are built to support high performance environments. That means it is safe to use the Segment Node library on a web server that's serving hundreds of requests per second. Every method you call **does not** result in an HTTP request, but is queued in memory instead. Messages are then flushed in batch in the background, which allows for much faster operation. -By default, our library will flush: +By default, Segment's library flushes: - The very first time it gets a message. - Every 20 messages (controlled by `options.flushAt`). diff --git a/src/connections/sources/catalog/libraries/server/node/index.md b/src/connections/sources/catalog/libraries/server/node/index.md index ff4141c812..cb78f333f5 100644 --- a/src/connections/sources/catalog/libraries/server/node/index.md +++ b/src/connections/sources/catalog/libraries/server/node/index.md @@ -13,7 +13,7 @@ All of Segment's server-side libraries are built for high-performance, so you ca > info "Using Analytics for Node.js Classic?" 
> If you’re still using the classic version of Analytics for Node.js, you can refer to the documentation [here](/docs/connections/sources/catalog/libraries/server/node/classic). ->

On [date], Segment will end support for Analytics Node.js Classic, which includes versions [1.x.x] and older. Upgrade to Analytics Node.js 2.0. See the Analytics Node.js 2.0 docs to learn more. +>

On April 1, 2023, Segment will end support for Analytics Node.js Classic, which includes versions 6.2.0 and older. Upgrade to new Analytics Node.js. See the updated [Analytics Node.js quickstart guide](/docs/connections/sources/catalog/libraries/server/node/quickstart/) to learn more. ## Getting Started diff --git a/src/connections/sources/catalog/libraries/server/node/quickstart.md b/src/connections/sources/catalog/libraries/server/node/quickstart.md index 6064c5a537..a2aafff37b 100644 --- a/src/connections/sources/catalog/libraries/server/node/quickstart.md +++ b/src/connections/sources/catalog/libraries/server/node/quickstart.md @@ -1,11 +1,11 @@ --- -title: 'Quickstart: Node.js Classic' +title: 'Quickstart: Node.js' redirect_from: '/connections/sources/catalog/libraries/server/node-js/quickstart/' strat: node-js --- > warning "Deprecation of Analytics Node.js Classic" -> On [date], Segment will end support for Analytics Node.js Classic, which includes versions [1.x.x] and older. Upgrade to Analytics Node.js 2.0. See the [Analytics Node.js 2.0] docs to learn more. +> On [date], Segment will end support for [Analytics Node.js Classic](/docs/connections/sources/catalog/libraries/server/node/classic/), which includes versions [6.2.0] and older. Upgrade to Analytics Node.js 2.0. See the [Analytics Node.js 2.0] docs to learn more. This tutorial will help you start sending data from your Node servers to Segment and any destination, using Segment's Node library. Check out the full documentation for [Analytics Node.js](/docs/connections/sources/catalog/libraries/server/node) to learn more. @@ -17,19 +17,28 @@ To get started with Analytics Node.js: 4. Give the source a display name, and enter the URL the source will collect data from. * When you create a Source in the Segment web app, it tells the Segment servers that you'll be sending data from a specific source type. When you create or change a Source in the Segment app, Segment generates a new Write Key for that source. You use the write key in your code to tell the Segment servers where the data is coming from, so Segment can route it to your destinations and other tools. 2. Install the module. - 1. Run the following npm command to install Segment: + 1. Run the following commands to install Segment: ``` - npm install --save analytics-node + # npm + npm install @segment/analytics-node + # yarn + yarn add @segment/analytics-node + # pnpm + pnpm install @segment/analytics-node ``` This will add the Node library module to your `package.json`. The module exposes an `Analytics` constructor, which you need to initialize with your Segment project's **Write Key**, like so: ```javascript - var Analytics = require('analytics-node'); - var analytics = new Analytics('YOUR_WRITE_KEY'); + import { Analytics } from '@segment/analytics-node' + // or, if you use require: + const { Analytics } = require('@segment/analytics-node') + + // instantiation + const analytics = new Analytics({ writeKey: '' }) ``` - This will create an instance of `Analytics` that you can use to send data to Segment for your project. The default initialization settings are production-ready and queue 20 messages before sending any requests. In development you might want to use [development settings](/docs/connections/sources/catalog/libraries/server/node#development). + This creates an instance of `Analytics` that you can use to send data to Segment for your project. The default initialization settings are production-ready and queue 20 messages before sending any requests. 
In development you might want to use [development settings](/docs/connections/sources/catalog/libraries/server/node#development). 3. Identify Users. * **Note:** For any of the different methods described in this quickstart, you can replace the properties and traits in the code samples with variables that represent the data collected. From 5b7df8d6c98ee639680902869f85ab3d6d34501c Mon Sep 17 00:00:00 2001 From: stayseesong Date: Mon, 9 Jan 2023 11:20:00 -0800 Subject: [PATCH 07/15] [netlify-build] --- .../sources/catalog/libraries/server/node/index.md | 8 ++++---- .../sources/catalog/libraries/server/node/quickstart.md | 2 +- 2 files changed, 5 insertions(+), 5 deletions(-) diff --git a/src/connections/sources/catalog/libraries/server/node/index.md b/src/connections/sources/catalog/libraries/server/node/index.md index cb78f333f5..85d65f2f17 100644 --- a/src/connections/sources/catalog/libraries/server/node/index.md +++ b/src/connections/sources/catalog/libraries/server/node/index.md @@ -1,11 +1,11 @@ --- title: Analytics for Node.js redirect_from: '/connections/sources/catalog/libraries/server/node-js/' -repo: analytics-node +repo: analytics-next strat: node-js --- -Segment's Analytics Node.js 2.0 library lets you record analytics data from your node code. The requests hit Segment's servers, and then Segment routes your data to any destinations you have enabled. +Segment's Analytics Node.js library lets you record analytics data from your node code. The requests hit Segment's servers, and then Segment routes your data to any destinations you have enabled. The [Segment Analytics Node.js Next library is open-source](https://github.com/segmentio/analytics-next/tree/master/packages/node){:target="_blank"} on GitHub. @@ -42,7 +42,7 @@ All of Segment's server-side libraries are built for high-performance, so you ca const analytics = new Analytics({ writeKey: '' }) ``` - Be sure to replace `YOUR_WRITE_KEY` with your actual **Write Key** which you can find in Segment under your source settings. + Be sure to replace `YOUR_WRITE_KEY` with your actual **Write Key** which you can find in Segment by navigating to: **Connections > Sources** and selecting your source and going to the **Settings** tab. This creates an instance of `Analytics` that you can use to send data to Segment for your project. The default initialization settings are production-ready and queue 20 messages before sending any requests. In development you might want to use [development settings](/docs/connections/sources/catalog/libraries/server/node/#development). @@ -288,7 +288,7 @@ Setting | Details `flushInterval` _number_ | The number of milliseconds to wait before flushing the queue automatically. The default is: `10000` `httpRequestTimeout` | The maximum number of milliseconds to wait for an http request. The default is: `10000` -### Graceful shutdown +## Graceful shutdown Avoid losing events after shutting down your console. Call `.closeAndFlush()` to stop collecting new events and flush all existing events. If a callback on an event call is included, this also waits for all callbacks to be called, and any of their subsequent promises to be resolved. 
```javascript diff --git a/src/connections/sources/catalog/libraries/server/node/quickstart.md b/src/connections/sources/catalog/libraries/server/node/quickstart.md index a2aafff37b..4f6f074da3 100644 --- a/src/connections/sources/catalog/libraries/server/node/quickstart.md +++ b/src/connections/sources/catalog/libraries/server/node/quickstart.md @@ -5,7 +5,7 @@ strat: node-js --- > warning "Deprecation of Analytics Node.js Classic" -> On [date], Segment will end support for [Analytics Node.js Classic](/docs/connections/sources/catalog/libraries/server/node/classic/), which includes versions [6.2.0] and older. Upgrade to Analytics Node.js 2.0. See the [Analytics Node.js 2.0] docs to learn more. +> On April 1, 2023, Segment will end support for [Analytics Node.js Classic](/docs/connections/sources/catalog/libraries/server/node/classic/), which includes versions 6.2.0 and older. Upgrade to new Analytics Node.js. See the [updated Analytics Node.js docs](/docs/connections/sources/catalog/libraries/server/node/) to learn more. This tutorial will help you start sending data from your Node servers to Segment and any destination, using Segment's Node library. Check out the full documentation for [Analytics Node.js](/docs/connections/sources/catalog/libraries/server/node) to learn more. From b5d18b518af69f7cb6e48e7b215cca3b036811b6 Mon Sep 17 00:00:00 2001 From: stayseesong Date: Wed, 11 Jan 2023 14:55:42 -0800 Subject: [PATCH 08/15] edits [netlify-build] --- .../sources/catalog/libraries/server/node/classic.md | 4 ++-- .../sources/catalog/libraries/server/node/index.md | 11 +++++------ .../catalog/libraries/server/node/quickstart.md | 5 +++-- 3 files changed, 10 insertions(+), 10 deletions(-) diff --git a/src/connections/sources/catalog/libraries/server/node/classic.md b/src/connections/sources/catalog/libraries/server/node/classic.md index 4c63764887..13a4f57c1f 100644 --- a/src/connections/sources/catalog/libraries/server/node/classic.md +++ b/src/connections/sources/catalog/libraries/server/node/classic.md @@ -5,8 +5,8 @@ strat: node-js hidden: true --- -> warning "Deprecation of Analytics Node.js Classic" -> On April 1, 2023, Segment will end support for Analytics Node.js Classic, which includes versions 6.2.0 and older. Upgrade to the new version of Analytics Node.js. See the updated [Analytics Node.js docs](/docs/connections/sources/catalog/libraries/server/node) to learn more. +> info "Upgrade to the new version of Analytics Node.js" +> Upgrade to the new version of Analytics Node.js. See the updated [Analytics Node.js docs](/docs/connections/sources/catalog/libraries/server/node) to learn more. Segment's Node.js library lets you record analytics data from your node code. The requests hit Segment's servers, and then Segment routes your data to any destinations you have enabled. diff --git a/src/connections/sources/catalog/libraries/server/node/index.md b/src/connections/sources/catalog/libraries/server/node/index.md index 85d65f2f17..523b89d0dc 100644 --- a/src/connections/sources/catalog/libraries/server/node/index.md +++ b/src/connections/sources/catalog/libraries/server/node/index.md @@ -5,16 +5,15 @@ repo: analytics-next strat: node-js --- +> info "" +> This version of Analytics for Node.js is in beta and Segment is actively working on this feature. Segment's [First-Access and Beta terms](https://segment.com/legal/first-access-beta-preview/) govern this feature. 
If you’re using the classic version of Analytics for Node.js, you can refer to the documentation [here](/docs/connections/sources/catalog/libraries/server/node/classic). + Segment's Analytics Node.js library lets you record analytics data from your node code. The requests hit Segment's servers, and then Segment routes your data to any destinations you have enabled. The [Segment Analytics Node.js Next library is open-source](https://github.com/segmentio/analytics-next/tree/master/packages/node){:target="_blank"} on GitHub. All of Segment's server-side libraries are built for high-performance, so you can use them in your web server controller code. This library uses an internal queue to make `identify` and `track` calls non-blocking and fast. It also batches messages and flushes asynchronously to Segment's servers. -> info "Using Analytics for Node.js Classic?" -> If you’re still using the classic version of Analytics for Node.js, you can refer to the documentation [here](/docs/connections/sources/catalog/libraries/server/node/classic). ->

On April 1, 2023, Segment will end support for Analytics Node.js Classic, which includes versions 6.2.0 and older. Upgrade to new Analytics Node.js. See the updated [Analytics Node.js quickstart guide](/docs/connections/sources/catalog/libraries/server/node/quickstart/) to learn more. - ## Getting Started > warning "" @@ -504,7 +503,6 @@ Every method you call **doesn't** result in a HTTP request, but is queued in mem By default, Segment's library will flush: - - The very first time it gets a message. - Every 15 messages (controlled by `settings.maxEventsInBatch`). - If 10 seconds has passed since the last flush (controlled by `settings.flushInterval`) @@ -516,7 +514,8 @@ If you don't want to batch messages, you can turn batching off by setting the `m const analytics = new Analytics({ ... maxEventsInBatch: 1 -});``` +}); +``` Batching means that your message might not get sent right away. Every method call takes an optional `callback`, which you can use to know when a particular message is flushed from the queue, like so: diff --git a/src/connections/sources/catalog/libraries/server/node/quickstart.md b/src/connections/sources/catalog/libraries/server/node/quickstart.md index 4f6f074da3..8d9fb8ef7e 100644 --- a/src/connections/sources/catalog/libraries/server/node/quickstart.md +++ b/src/connections/sources/catalog/libraries/server/node/quickstart.md @@ -4,8 +4,9 @@ redirect_from: '/connections/sources/catalog/libraries/server/node-js/quickstart strat: node-js --- -> warning "Deprecation of Analytics Node.js Classic" -> On April 1, 2023, Segment will end support for [Analytics Node.js Classic](/docs/connections/sources/catalog/libraries/server/node/classic/), which includes versions 6.2.0 and older. Upgrade to new Analytics Node.js. See the [updated Analytics Node.js docs](/docs/connections/sources/catalog/libraries/server/node/) to learn more. +> info "" +> This version of Analytics for Node.js is in beta and Segment is actively working on this feature. Segment's [First-Access and Beta terms](https://segment.com/legal/first-access-beta-preview/) govern this feature. +>

If you’re using the [classic version of Analytics for Node.js](/docs/connections/sources/catalog/libraries/server/node/classic/), consider upgrading to the [latest Analytics for Node.js library](/docs/connections/sources/catalog/libraries/server/node). This tutorial will help you start sending data from your Node servers to Segment and any destination, using Segment's Node library. Check out the full documentation for [Analytics Node.js](/docs/connections/sources/catalog/libraries/server/node) to learn more. From 7edc2f3857f413c26080d091c0fc0758442826b8 Mon Sep 17 00:00:00 2001 From: stayseesong Date: Thu, 12 Jan 2023 17:29:54 -0800 Subject: [PATCH 09/15] edits [netlify-build] --- src/_data/sidenav/strat.yml | 4 + .../catalog/libraries/server/node/index.md | 12 +-- .../libraries/server/node/migration.md | 90 +++++++++++++++++++ 3 files changed, 100 insertions(+), 6 deletions(-) create mode 100644 src/connections/sources/catalog/libraries/server/node/migration.md diff --git a/src/_data/sidenav/strat.yml b/src/_data/sidenav/strat.yml index 6e1394a567..70b8f9df36 100644 --- a/src/_data/sidenav/strat.yml +++ b/src/_data/sidenav/strat.yml @@ -197,3 +197,7 @@ sections: title: Analytics-Node.js - path: /connections/sources/catalog/libraries/server/node/quickstart/ title: Quickstart Node.js + - path: /connections/sources/catalog/libraries/server/node/classic/ + title: Analytics-Node.js Classic + - path: /connections/sources/catalog/libraries/server/node/migration/ + title: Analytics-Node.js Migration Guide diff --git a/src/connections/sources/catalog/libraries/server/node/index.md b/src/connections/sources/catalog/libraries/server/node/index.md index 523b89d0dc..11515a57b5 100644 --- a/src/connections/sources/catalog/libraries/server/node/index.md +++ b/src/connections/sources/catalog/libraries/server/node/index.md @@ -19,7 +19,7 @@ All of Segment's server-side libraries are built for high-performance, so you ca > warning "" > Make sure you're using a version of Node that's 14 or higher. -1. Run the following to add Segment's Node library module to your `package.json`. +1. Run the relevant command to add Segment's Node library module to your `package.json`. ```bash # npm @@ -43,7 +43,7 @@ All of Segment's server-side libraries are built for high-performance, so you ca Be sure to replace `YOUR_WRITE_KEY` with your actual **Write Key** which you can find in Segment by navigating to: **Connections > Sources** and selecting your source and going to the **Settings** tab. - This creates an instance of `Analytics` that you can use to send data to Segment for your project. The default initialization settings are production-ready and queue 20 messages before sending any requests. In development you might want to use [development settings](/docs/connections/sources/catalog/libraries/server/node/#development). + This creates an instance of `Analytics` that you can use to send data to Segment for your project. The default initialization settings are production-ready and queue 20 messages before sending any requests. ## Basic tracking methods @@ -96,7 +96,7 @@ Field | Details `userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any identify call._ `anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). 
_Note: You must include at least one of `userId` or `anonymousId` in all identify calls._ `traits` _Object, optional_ | A dictionary of [traits](/docs/connections/spec/identify#traits) you know about the user. Things like: `email`, `name` or `friends`. -`timestamp` _Date, optional_ | A JavaScript date object representing when the identify took place. If the identify just happened, leave it out and we'll use the server's time. If you're importing data from the past make sure you to send a `timestamp`. +`timestamp` _Date, optional_ | A JavaScript date object representing when the identify took place. If the identify just happened, leave it out as Segment uses the server's time. If you're importing data from the past make sure to send a `timestamp`. `context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ Find details on the **identify method payload** in Segment's [Spec](/docs/connections/spec/identify/). @@ -359,12 +359,12 @@ analytics.on('track', (ctx) => console.log(ctx)) ``` ## Plugin architecture -When you develop against Analytics 2.0, the plugins you write can augment functionality, enrich data, and control the flow and delivery of events. From modifying event payloads to changing analytics functionality, plugins help to speed up the process of getting things done. +When you develop against [Analytics.js 2.0](/docs/connections/sources/catalog/libraries/website/javascript/), the plugins you write can augment functionality, enrich data, and control the flow and delivery of events. From modifying event payloads to changing analytics functionality, plugins help to speed up the process of getting things done. Though middlewares function the same as plugins, it's best to use plugins as they are easier to implement and are more testable. ### Plugin categories -Plugins are bound by Analytics 2.0 which handles operations such as observability, retries, and error handling. There are two different categories of plugins: +Plugins are bound by Analytics.js 2.0 which handles operations such as observability, retries, and error handling. There are two different categories of plugins: * **Critical Plugins**: Analytics.js expects this plugin to be loaded before starting event delivery. Failure to load a critical plugin halts event delivery. Use this category sparingly, and only for plugins that are critical to your tracking. * **Non-critical Plugins**: Analytics.js can start event delivery before this plugin finishes loading. This means your plugin can fail to load independently from all other plugins. For example, every Analytics.js destination is a non-critical plugin. This makes it possible for Analytics.js to continue working if a partner destination fails to load, or if users have ad blockers turned on that are targeting specific destinations. 
@@ -455,7 +455,7 @@ await analytics.register(pluginA, pluginB, pluginC) Deregister a plugin by using: ```js -await analytics.dereigsrer("pluginNameA", "pluginNameB") // takes strings +await analytics.deregister("pluginNameA", "pluginNameB") // takes strings ``` ## Selecting Destinations diff --git a/src/connections/sources/catalog/libraries/server/node/migration.md b/src/connections/sources/catalog/libraries/server/node/migration.md new file mode 100644 index 0000000000..ac5d055220 --- /dev/null +++ b/src/connections/sources/catalog/libraries/server/node/migration.md @@ -0,0 +1,90 @@ +--- +title: Analytics for Node.js Migration Guide +repo: analytics-next +strat: node-js +--- + +> info "" +> This version of Analytics for Node.js is in beta and Segment is actively working on this feature. Segment's [First-Access and Beta terms](https://segment.com/legal/first-access-beta-preview/) govern this feature. + +If you're using the [classic version of Analytics Node.js](/docs/connections/sources/catalog/libraries/server/node/classic), follow these steps to upgrade to the [latest version of Analytics Node.js](/connections/sources/catalog/libraries/server/node/). + +1. Change the named imports. + +
Before: + ```javascript + import Analytics from 'analytics-node' + ``` + + After: + ```javascript + import { Analytics } from '@segment/analytics-node' + ``` +2. Change instantiation to have an object as the first argument. + +
Before: + ```javascript + var analytics = new Analytics('YOUR_WRITE_KEY'); + ``` + + After: + ```javascript + const analytics = new Analytics({ writeKey: '' }) + ``` +3. Change flushing to [graceful shutdown](/docs/connections/sources/catalog/libraries/server/node//#graceful-shutdown). + +
Before: + ```javascript + await analytics.flush(function(err, batch) { + console.log('Flushed, and now this program can exit!'); + }); + ``` + + After: + ```javascript + await analytics.closeAndFlush() + ``` + +### Differences to note between the classic and updated version + +* The callback call signature changed. + +
Before: + ```javascript + (err, batch) => void + ``` + + After: + ```javascript + (err, ctx) => void + ``` +* The `flushAt` configuration option changed to `maxEventsInBatch`. + +#### Removals +The updated Analytics Node.js removed these configuration options: +- `enable` +- `errorHandler` (see the docs on [error handling](/docs/connections/sources/catalog/libraries/server/node//#error-handling) for more information) + +The updated Analytics Node.js library removed undocumented behavior around `track` properties + +Before: + +```javascript +analytics.track({ + ... + event: 'Ultimate Played', + myProp: 'abc' +}) +``` + +After: + +```javascript +analytics.track({ + ... + event: 'Ultimate Played', + properties: { + myProp: 'abc' + } +}) +``` \ No newline at end of file From e3527ec6ee3a921ccd15e5d8b566208cf2f59f40 Mon Sep 17 00:00:00 2001 From: stayseesong Date: Fri, 13 Jan 2023 14:16:21 -0800 Subject: [PATCH 10/15] edits --- .../catalog/libraries/server/node/index.md | 20 ++++++++++++++++++- .../libraries/server/node/migration.md | 2 +- 2 files changed, 20 insertions(+), 2 deletions(-) diff --git a/src/connections/sources/catalog/libraries/server/node/index.md b/src/connections/sources/catalog/libraries/server/node/index.md index 11515a57b5..38f1a77f5d 100644 --- a/src/connections/sources/catalog/libraries/server/node/index.md +++ b/src/connections/sources/catalog/libraries/server/node/index.md @@ -348,7 +348,7 @@ analytics.on('error', (err) => console.error(err)) ### Event emitter interface -The event emitter interface allows you to track when certain things happen in the app, such as a track call or an error, and it will call the function you provided with some arguments when that event happens. +The event emitter interface allows you to track events, such as `track` and `identify` calls, and it calls the function you provided with some arguments upon successful delivery. `error` emits on delivery error. See the complete list of emitted events in the [GitHub Node repository](https://github.com/segmentio/analytics-next/blob/master/packages/node/src/app/emitter.ts). ```javascript analytics.on('error', (err) => console.error(err)) @@ -358,6 +358,24 @@ analytics.on('identify', (ctx) => console.log(ctx)) analytics.on('track', (ctx) => console.log(ctx)) ``` +Use the emitter to log all HTTP Requests. + + ```javascript + analytics.on('http_request', (event) => console.log(event)) + + // when triggered, emits an event of the shape: + { + url: 'https://api.segment.io/v1/batch', + method: 'POST', + headers: { + 'Content-Type': 'application/json', + ... + }, + body: '...', + } + ``` + + ## Plugin architecture When you develop against [Analytics.js 2.0](/docs/connections/sources/catalog/libraries/website/javascript/), the plugins you write can augment functionality, enrich data, and control the flow and delivery of events. From modifying event payloads to changing analytics functionality, plugins help to speed up the process of getting things done. diff --git a/src/connections/sources/catalog/libraries/server/node/migration.md b/src/connections/sources/catalog/libraries/server/node/migration.md index ac5d055220..b80245bd44 100644 --- a/src/connections/sources/catalog/libraries/server/node/migration.md +++ b/src/connections/sources/catalog/libraries/server/node/migration.md @@ -29,7 +29,7 @@ If you're using the [classic version of Analytics Node.js](/docs/connections/sou After: ```javascript - const analytics = new Analytics({ writeKey: '' }) + const analytics = new Analytics({ writeKey: '' }) ``` 3. 
Change flushing to [graceful shutdown](/docs/connections/sources/catalog/libraries/server/node//#graceful-shutdown). From 0d069f012db8ab170a969206f1e7b4cdf331b15f Mon Sep 17 00:00:00 2001 From: stayseesong Date: Fri, 13 Jan 2023 14:48:45 -0800 Subject: [PATCH 11/15] [netlify-build] --- .../catalog/libraries/server/node/classic.md | 4 ++-- .../catalog/libraries/server/node/index.md | 15 +++++++-------- 2 files changed, 9 insertions(+), 10 deletions(-) diff --git a/src/connections/sources/catalog/libraries/server/node/classic.md b/src/connections/sources/catalog/libraries/server/node/classic.md index 13a4f57c1f..d8bb4159e1 100644 --- a/src/connections/sources/catalog/libraries/server/node/classic.md +++ b/src/connections/sources/catalog/libraries/server/node/classic.md @@ -2,11 +2,11 @@ title: Analytics for Node.js Classic repo: analytics-node strat: node-js -hidden: true +hidden: false --- > info "Upgrade to the new version of Analytics Node.js" -> Upgrade to the new version of Analytics Node.js. See the updated [Analytics Node.js docs](/docs/connections/sources/catalog/libraries/server/node) to learn more. +> There's a new version of Analytics Node.js. [Upgrade](/docs/connections/sources/catalog/libraries/server/node/migration/) to the latest version. See the updated [Analytics Node.js docs](/docs/connections/sources/catalog/libraries/server/node) to learn more. Segment's Node.js library lets you record analytics data from your node code. The requests hit Segment's servers, and then Segment routes your data to any destinations you have enabled. diff --git a/src/connections/sources/catalog/libraries/server/node/index.md b/src/connections/sources/catalog/libraries/server/node/index.md index 38f1a77f5d..fd56b77dfc 100644 --- a/src/connections/sources/catalog/libraries/server/node/index.md +++ b/src/connections/sources/catalog/libraries/server/node/index.md @@ -391,13 +391,13 @@ Plugins are bound by Analytics.js 2.0 which handles operations such as observabi Non-critical plugins run through a timeline that executes in order of insertion based on the entry type. Segment has these five entry types of non-critical plugins: -| Type | Details | -| ------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | -| `before` | Executes before event processing begins. These are plugins that run before any other plugins run.

<br><br>For example, validating events before passing them along to other plugins. A failure here could halt the event pipeline.<br><br>See the example of how Analytics.js uses the [Event Validation plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/validation/index.ts){:target="_blank"} to verify that every event has the correct shape. |
| `enrichment` | Executes as the first level of event processing. These plugins modify an event.<br><br>See the example of how Analytics.js uses the [Page Enrichment plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/page-enrichment/index.ts){:target="_blank"} to enrich every event with page information. |
| `destination` | Executes as events begin to pass off to destinations.<br><br>This doesn't modify the event outside of the specific destination, and failure doesn't halt the execution. |
| `after` | Executes after all event processing completes. You can use this to perform cleanup operations.<br><br>An example of this is the [Segment.io Plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/segmentio/index.ts){:target="_blank"} which waits for destinations to succeed or fail so it can send it observability metrics. |
| `utility` | Executes once during the bootstrap, to give you an outlet to make any modifications as to how Analytics.js works internally. This allows you to augment Analytics.js functionality. |
+| Type | Details
+------ | --------
+| `before` | Executes before event processing begins. These are plugins that run before any other plugins run.<br><br>For example, validating events before passing them along to other plugins. A failure here could halt the event pipeline.<br><br>See the example of how Analytics.js uses the [Event Validation plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/validation/index.ts){:target="_blank"} to verify that every event has the correct shape.
+| `enrichment` | Executes as the first level of event processing. These plugins modify an event.<br><br>See the example of how Analytics.js uses the [Page Enrichment plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/page-enrichment/index.ts){:target="_blank"} to enrich every event with page information.
+| `destination` | Executes as events begin to pass off to destinations.<br><br>This doesn't modify the event outside of the specific destination, and failure doesn't halt the execution.
+| `after` | Executes after all event processing completes. You can use this to perform cleanup operations.<br><br>
An example of this is the [Segment.io Plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/segmentio/index.ts){:target="_blank"} which waits for destinations to succeed or fail so it can send it observability metrics. +| `utility` | Executes once during the bootstrap, to give you an outlet to make any modifications as to how Analytics.js works internally. This allows you to augment Analytics.js functionality. ### Example plugins Here's an example of a plugin that converts all track event names to lowercase before the event goes through the rest of the pipeline: @@ -455,7 +455,6 @@ const identityStitching = () => { return identity } - ``` You can view Segment's [existing plugins](https://github.com/segmentio/analytics-next/tree/master/src/plugins){:target="_blank"} to see more examples. From 59f1907b4dede0ca187f932f2db1a256fa2f5e0e Mon Sep 17 00:00:00 2001 From: stayseesong Date: Tue, 17 Jan 2023 09:40:52 -0800 Subject: [PATCH 12/15] edits --- src/connections/sources/catalog/libraries/server/node/index.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/connections/sources/catalog/libraries/server/node/index.md b/src/connections/sources/catalog/libraries/server/node/index.md index fd56b77dfc..8c12861568 100644 --- a/src/connections/sources/catalog/libraries/server/node/index.md +++ b/src/connections/sources/catalog/libraries/server/node/index.md @@ -348,7 +348,7 @@ analytics.on('error', (err) => console.error(err)) ### Event emitter interface -The event emitter interface allows you to track events, such as `track` and `identify` calls, and it calls the function you provided with some arguments upon successful delivery. `error` emits on delivery error. See the complete list of emitted events in the [GitHub Node repository](https://github.com/segmentio/analytics-next/blob/master/packages/node/src/app/emitter.ts). +The event emitter interface allows you to track events, such as `track` and `identify` calls, and it calls the function you provided with some arguments upon successful delivery. `error` emits on delivery error. ```javascript analytics.on('error', (err) => console.error(err)) From 7e26349f3396c51b861a7fa1ba92a28cc7a6c2ac Mon Sep 17 00:00:00 2001 From: stayseesong <83784848+stayseesong@users.noreply.github.com> Date: Wed, 18 Jan 2023 17:15:57 -0800 Subject: [PATCH 13/15] Apply suggestions from code review Co-authored-by: markzegarelli --- .../sources/catalog/libraries/server/node/classic.md | 4 ++-- .../sources/catalog/libraries/server/node/migration.md | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/src/connections/sources/catalog/libraries/server/node/classic.md b/src/connections/sources/catalog/libraries/server/node/classic.md index d8bb4159e1..f2df7647eb 100644 --- a/src/connections/sources/catalog/libraries/server/node/classic.md +++ b/src/connections/sources/catalog/libraries/server/node/classic.md @@ -92,7 +92,7 @@ Field | Details `anonymousId` _String, optional_ | An ID associated with the user when you don't know who they are (for example, [the anonymousId generated by `analytics.js`](/docs/connections/sources/catalog/libraries/website/javascript/#anonymous-id)). _Note: You must include at least one of `userId` or `anonymousId` in all identify calls._ `traits` _Object, optional_ | A dictionary of [traits](/docs/connections/spec/identify#traits) you know about the user. Things like: `email`, `name` or `friends`. 
`timestamp` _Date, optional_ | A JavaScript date object representing when the identify call took place. If the identify just happened, leave it out as Segment will use the server's time. If you're importing data from the past make sure you to send a `timestamp`. -`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ +`context` _Object, optional_ | A dictionary of extra [context](/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ Find details on the **identify method payload** in the Segment [Spec](/docs/connections/spec/identify/). @@ -143,7 +143,7 @@ Field | Details `event` _String_ | The name of the event you're tracking. Segment recommends you use human-readable names like `Song Played` or `Status Updated`. `properties` _Object, optional_ | A dictionary of properties for the event. If the event was `Product Added`, it might have properties like `price` or `product`. `timestamp` _Date, optional_ | A JavaScript date object representing when the track took place. If the track just happened, leave it out as Segment will use the server's time. If you're importing data from the past make sure you to send a `timestamp`. -`context` _Object, optional_ | A dictionary of extra [context](https://segment.com/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ +`context` _Object, optional_ | A dictionary of extra [context](/docs/connections/spec/common/#context) to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ Find details on **best practices in event naming** as well as the **`track` method payload** in the Segment [Spec](/docs/connections/spec/track/). diff --git a/src/connections/sources/catalog/libraries/server/node/migration.md b/src/connections/sources/catalog/libraries/server/node/migration.md index b80245bd44..0e501be368 100644 --- a/src/connections/sources/catalog/libraries/server/node/migration.md +++ b/src/connections/sources/catalog/libraries/server/node/migration.md @@ -31,7 +31,7 @@ If you're using the [classic version of Analytics Node.js](/docs/connections/sou ```javascript const analytics = new Analytics({ writeKey: '' }) ``` -3. Change flushing to [graceful shutdown](/docs/connections/sources/catalog/libraries/server/node//#graceful-shutdown). +3. Change flushing to [graceful shutdown](/docs/connections/sources/catalog/libraries/server/node/#graceful-shutdown).
Before: ```javascript From 1b65026775a1074abac66a8de41b4a4fb3ed3c34 Mon Sep 17 00:00:00 2001 From: stayseesong <83784848+stayseesong@users.noreply.github.com> Date: Thu, 19 Jan 2023 08:45:21 -0800 Subject: [PATCH 14/15] Update src/connections/sources/catalog/libraries/server/node/index.md --- src/connections/sources/catalog/libraries/server/node/index.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/connections/sources/catalog/libraries/server/node/index.md b/src/connections/sources/catalog/libraries/server/node/index.md index 8c12861568..439300def3 100644 --- a/src/connections/sources/catalog/libraries/server/node/index.md +++ b/src/connections/sources/catalog/libraries/server/node/index.md @@ -377,7 +377,7 @@ Use the emitter to log all HTTP Requests. ## Plugin architecture -When you develop against [Analytics.js 2.0](/docs/connections/sources/catalog/libraries/website/javascript/), the plugins you write can augment functionality, enrich data, and control the flow and delivery of events. From modifying event payloads to changing analytics functionality, plugins help to speed up the process of getting things done. +When you develop in [Analytics.js 2.0](/docs/connections/sources/catalog/libraries/website/javascript/), the plugins you write can improve functionality, enrich data, and control the flow and delivery of events. From modifying event payloads to changing analytics functionality, plugins help to speed up the process of getting things done. Though middlewares function the same as plugins, it's best to use plugins as they are easier to implement and are more testable. From 72937f92b8550e589e903dc9c319d64b0d6bbd9b Mon Sep 17 00:00:00 2001 From: stayseesong <83784848+stayseesong@users.noreply.github.com> Date: Thu, 19 Jan 2023 09:24:27 -0800 Subject: [PATCH 15/15] Update src/connections/sources/catalog/libraries/server/node/migration.md --- .../sources/catalog/libraries/server/node/migration.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/connections/sources/catalog/libraries/server/node/migration.md b/src/connections/sources/catalog/libraries/server/node/migration.md index 0e501be368..ee06225e5e 100644 --- a/src/connections/sources/catalog/libraries/server/node/migration.md +++ b/src/connections/sources/catalog/libraries/server/node/migration.md @@ -45,7 +45,7 @@ If you're using the [classic version of Analytics Node.js](/docs/connections/sou await analytics.closeAndFlush() ``` -### Differences to note between the classic and updated version +### Key differences between the classic and updated version * The callback call signature changed.
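
Taken together, this patch series documents the upgraded Analytics Node.js usage: object-based constructor settings, the event emitter interface, `track` properties nested under `properties`, and `closeAndFlush()` for graceful shutdown. The snippet below is a minimal sketch of that flow; the `require` shape, write key, event name, and `maxEventsInBatch` value are illustrative assumptions rather than text taken from the patches.

```javascript
// Minimal sketch of the upgraded Analytics Node.js flow described in these patches.
// The named `Analytics` export, the write key, the event name, and the
// `maxEventsInBatch` value are assumptions for illustration only.
const { Analytics } = require('@segment/analytics-node')

const analytics = new Analytics({
  writeKey: 'YOUR_WRITE_KEY',   // placeholder — use your source's write key
  maxEventsInBatch: 15          // replaces the classic `flushAt` option; value is illustrative
})

// Event emitter interface: listen for delivery errors and successful calls.
analytics.on('error', (err) => console.error(err))
analytics.on('track', (ctx) => console.log(ctx))

async function main() {
  // Custom properties are nested under `properties` in the updated library.
  analytics.track({
    userId: '019mr8mf4r',
    event: 'Ultimate Played',
    properties: { myProp: 'abc' }
  })

  // Graceful shutdown replaces the classic callback-based flush.
  await analytics.closeAndFlush()
}

main()
```

A smaller `maxEventsInBatch` makes events flush sooner, which can be convenient while testing.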