Analytics for Node.js
Flagship libraries offer the most up-to-date functionality on Segment’s most popular platforms. Segment actively maintains flagship libraries, which benefit from new feature releases and ongoing development and support.
Segment’s Analytics Node.js library lets you record analytics data from your node code. The requests hit Segment’s servers, and then Segment routes your data to any destinations you have enabled.
The Segment Analytics Node.js Next library is open-source on GitHub.
All of Segment’s server-side libraries are built for high performance, so you can use them in your web server controller code. This library uses an internal queue to make Identify and Track calls non-blocking and fast. It also batches messages and flushes asynchronously to Segment’s servers.
Getting Started
- Make sure you’re using a version of Node that’s 16 or higher.
- Run the relevant command to add Segment’s Node library module to your `package.json`:

# npm
npm install @segment/analytics-node

# yarn
yarn add @segment/analytics-node

# pnpm
pnpm install @segment/analytics-node
- Initialize the `Analytics` constructor the module exposes with your Segment source Write Key, like so:

import { Analytics } from '@segment/analytics-node'
// or, if you use require:
const { Analytics } = require('@segment/analytics-node')

// instantiation
const analytics = new Analytics({ writeKey: '<YOUR_WRITE_KEY>' })
Be sure to replace `<YOUR_WRITE_KEY>` with your actual Write Key, which you can find in Segment by navigating to Connections > Sources, selecting your source, and opening the Settings tab.
This creates an instance of `Analytics` that you can use to send data to Segment for your project. The default initialization settings are production-ready: the client queues messages in memory (15 by default) and sends them in batches rather than making a request per call.
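Once the client is instantiated, you can send a single event to confirm everything is wired up. This is just a sketch; the event name and user ID below are placeholders:

// send one placeholder event to verify the setup
analytics.track({
  userId: 'test-user', // placeholder ID for illustration
  event: 'Test Event Sent'
})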
Basic tracking methods
The basic tracking methods below serve as the building blocks of your Segment tracking. They include Identify, Track, Page, Group, and Alias.
These methods correspond with those used in the Segment Spec. The documentation on this page explains how to use these methods in Analytics Node.js Next.
Identify
Good to know
For any of the different methods described on this page, you can replace the properties and traits in the code samples with variables that represent the data collected.
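For example, rather than hard-coding values, you might pass in variables your application already holds (the `user` object below is hypothetical):

// hypothetical values pulled from your own application state
const userId = user.id
const traits = { email: user.email, plan: user.planName }

analytics.identify({ userId, traits })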
Identify lets you tie a user to their actions and record traits about them. It includes a unique User ID and/or anonymous ID, and any optional traits you know about them.
You should call Identify once when the user’s account is first created, and then again any time their traits change.
Example of an anonymous Identify call:
analytics.identify({
anonymousId: '48d213bb-95c3-4f8d-af97-86b2b404dcfe',
traits: {
friends: 42
}
});
This call identifies the user by their unique anonymous ID and labels them with the `friends` trait.
Example of an Identify call for an identified user:
analytics.identify({
userId: '019mr8mf4r',
traits: {
name: 'Michael Bolton',
email: 'mbolton@example.com',
plan: 'Enterprise',
friends: 42
}
});
The call above identifies Michael by his unique User ID (the one you know him by in your database), and labels him with the `name`, `email`, `plan`, and `friends` traits.
An Identify call has the following fields:
| Field | Details |
|---|---|
| `userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any Identify call._ |
| `anonymousId` _String, optional_ | An ID associated with the user when you don’t know who they are (for example, the `anonymousId` generated by `analytics.js`). _Note: at least one of `userId` or `anonymousId` must be included in any Identify call._ |
| `traits` _Object, optional_ | A dictionary of traits you know about the user, like `email`, `name`, or `friends`. |
| `timestamp` _Date, optional_ | A JavaScript date object representing when the Identify call took place. If the Identify just happened, leave it out and Segment uses the server’s time. If you’re importing data from the past, make sure to send a `timestamp`. |
| `context` _Object, optional_ | A dictionary of extra context to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ |
Find details on the identify method payload in Segment’s Spec.
Track
Track lets you record the actions your users perform. Every action triggers what Segment calls an “event”, which can also have associated properties.
You’ll want to track events that are indicators of success for your site, like Signed Up, Item Purchased or Article Bookmarked.
To get started, Segment recommends tracking just a few important events. You can always add more later.
Example anonymous Track call:
analytics.track({
anonymousId: '48d213bb-95c3-4f8d-af97-86b2b404dcfe',
event: 'Item Purchased',
properties: {
revenue: 39.95,
shippingMethod: '2-day'
}
});
Example identified Track call:
analytics.track({
userId: '019mr8mf4r',
event: 'Item Purchased',
properties: {
revenue: 39.95,
shippingMethod: '2-day'
}
});
This example Track call tells you that your user just triggered the Item Purchased event with a revenue of $39.95 and chose your hypothetical ‘2-day’ shipping.
Track event properties can be anything you want to record. In this case, revenue and shipping method.
The Track call has the following fields:
| Field | Details |
|---|---|
| `userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any Track call._ |
| `anonymousId` _String, optional_ | An ID associated with the user when you don’t know who they are (for example, the `anonymousId` generated by `analytics.js`). _Note: at least one of `userId` or `anonymousId` must be included in any Track call._ |
| `event` _String_ | The name of the event you’re tracking. Segment recommends human-readable names like `Song Played` or `Status Updated`. |
| `properties` _Object, optional_ | A dictionary of properties for the event. If the event was `Product Added`, it might have properties like `price` or `product`. |
| `timestamp` _Date, optional_ | A JavaScript date object representing when the Track call took place. If the Track just happened, leave it out and Segment uses the server’s time. If you’re importing data from the past, make sure to send a `timestamp`. |
| `context` _Object, optional_ | A dictionary of extra context to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ |
Find details on best practices in event naming as well as the Track method payload in the Segment Spec.
Page
The Page method lets you record page views on your website, along with optional extra information about the page being viewed.
If you’re using Segment’s client-side setup in combination with the Node.js library, Page calls are already tracked for you by default. However, if you want to record your own page views manually and aren’t using the client-side library, read on.
Example Page call:
analytics.page({
userId: '019mr8mf4r',
category: 'Docs',
name: 'Node.js Library',
properties: {
url: 'https://segment.com/docs/connections/sources/catalog/librariesnode',
path: '/docs/connections/sources/catalog/librariesnode/',
title: 'Node.js Library - Segment',
referrer: 'https://github.com/segmentio/analytics-node'
}
});
A Page call has the following fields:
| Field | Details |
|---|---|
| `userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any Page call._ |
| `anonymousId` _String, optional_ | An ID associated with the user when you don’t know who they are (for example, the `anonymousId` generated by `analytics.js`). _Note: at least one of `userId` or `anonymousId` must be included in any Page call._ |
| `category` _String, optional_ | The category of the page. Useful for industries, like ecommerce, where many pages often live under a larger category. |
| `name` _String, optional_ | The name of the page, for example Signup or Home. |
| `properties` _Object, optional_ | A dictionary of properties of the page. A few properties are specially recognized and automatically translated: `url`, `title`, `referrer`, and `path`, but you can add your own too. |
| `timestamp` _Date, optional_ | A JavaScript date object representing when the Page call took place. If the Page just happened, leave it out and Segment uses the server’s time. If you’re importing data from the past, make sure to send a `timestamp`. |
| `context` _Object, optional_ | A dictionary of extra context to attach to the call. _Note: `context` differs from `traits` because it is not attributes of the user itself._ |
Find details on the Page payload in the Segment Spec.
Group
Group lets you associate an identified user with a group. A group could be a company, organization, account, project or team. It also lets you record custom traits about the group, like industry or number of employees.
This is useful for tools like Intercom, Preact and Totango, as it ties the user to a group of other users.
Example Group call:
analytics.group({
userId: '019mr8mf4r',
groupId: '56',
traits: {
name: 'Initech',
description: 'Accounting Software'
}
});
The Group call has the following fields:
| Field | Details |
|---|---|
| `userId` _String, optional_ | The ID for this user in your database. _Note: at least one of `userId` or `anonymousId` must be included in any Group call._ |
| `anonymousId` _String, optional_ | An ID associated with the user when you don’t know who they are (for example, the `anonymousId` generated by `analytics.js`). _Note: at least one of `userId` or `anonymousId` must be included in any Group call._ |
| `groupId` _String_ | The ID of the group. |
| `traits` _Object, optional_ | A dictionary of traits you know about the group. For a company, they might be things like `name`, `address`, or `phone`. Learn more about traits. |
| `context` _Object, optional_ | A dictionary containing any context about the request. To see the full reference of supported keys, check them out in the context reference. |
| `timestamp` _Date, optional_ | A JavaScript date object representing when the Group call took place. If the Group just happened, leave it out and Segment uses the server’s time. If you’re importing data from the past, make sure to send a `timestamp`. |
| `integrations` _Object, optional_ | A dictionary of destinations to enable or disable. |
Find more details about Group, including the Group payload, in the Segment Spec.
Alias
The Alias call allows you to associate one identity with another. This is an advanced method and should not be widely used, but is required to manage user identities in some destinations. Other destinations do not support the alias call.
In Mixpanel it’s used to associate an anonymous user with an identified user once they sign up. For Kissmetrics, if your user switches IDs, you can use ‘alias’ to rename the ‘userId’.
Example Alias call:
analytics.alias({
previousId: 'old_id',
userId: 'new_id'
});
The Alias call has the following fields:
| Field | Details |
|---|---|
| `userId` _String_ | The ID for this user in your database. |
| `previousId` _String_ | The previous ID to alias from. |
Here’s a full example of how Segment might use the Alias call:
// the anonymous user does actions ...
analytics.track({ userId: 'anonymous_user', event: 'Anonymous Event' })
// the anonymous user signs up and is aliased
analytics.alias({ previousId: 'anonymous_user', userId: 'identified@example.com' })
// the identified user is identified
analytics.identify({ userId: 'identified@example.com', traits: { plan: 'Free' } })
// the identified user does actions ...
analytics.track({ userId: 'identified@example.com', event: 'Identified Action' })
For more details about Alias, including the Alias call payload, check out the Segment Spec.
Configuration
The `Analytics` constructor takes a settings object; beyond the required `writeKey`, the remaining settings are optional and configure the module.
const analytics = new Analytics({
writeKey: '<MY_WRITE_KEY>',
host: 'https://api.segment.io',
path: '/v1/batch',
maxRetries: 3,
flushAt: 15,
flushInterval: 10000,
// ... and more!
})
| Setting | Details |
|---|---|
| `writeKey` _string_ | The key that corresponds to your Segment.io project. |
| `host` _string_ | The base URL of the API. Default: `https://api.segment.io` |
| `path` _string_ | The API path route. Default: `/v1/batch` |
| `maxRetries` _number_ | The number of times to retry flushing a batch. Default: `3` |
| `flushAt` _number_ | The number of messages to enqueue before flushing. Default: `15` |
| `flushInterval` _number_ | The number of milliseconds to wait before flushing the queue automatically. Default: `10000` |
| `httpRequestTimeout` _number_ | The maximum number of milliseconds to wait for an HTTP request. Default: `10000` |
| `disable` _boolean_ | Disable the analytics library for testing. Default: `false` |
| `httpClient` _HTTPClient or HTTPClientFn_ | A custom HTTP client implementation to support alternate libraries or proxies. Defaults to global `fetch`, or `node-fetch` for older versions of Node. See the Override the default HTTP Client section for more details. |
See the complete `AnalyticsSettings` interface in the analytics-next repository.
Usage in serverless environments
When calling Track within functions in serverless runtime environments, wrap the call in a `Promise` and `await` it to avoid having the runtime exit or freeze:
await new Promise((resolve) =>
analytics().track({ ... }, resolve)
)
See the complete documentation on Usage in AWS Lambda, Usage in Vercel Edge Functions, and Usage in Cloudflare Workers.
Graceful shutdown
To avoid losing events when your application shuts down, call `.closeAndFlush()` to stop collecting new events and flush all existing events. If an event call includes a callback, this also waits for all callbacks to be called, and for any of their subsequent promises to resolve.
await analytics.closeAndFlush()
// or
await analytics.closeAndFlush({ timeout: 5000 }) // force resolve after 5000ms
Here’s an example of how to use graceful shutdown:
const app = express()
const server = app.listen(3000)
const onExit = async () => {
await analytics.closeAndFlush()
server.close(() => {
console.log("Gracefully closing server...")
process.exit()
})
}
['SIGINT', 'SIGTERM'].forEach((code) => process.on(code, onExit))
Collect unflushed events
If you need to preserve all of your events in the event of a forced timeout, even ones that came in after `analytics.closeAndFlush()` was called, you can still collect them by using:
const unflushedEvents = []
analytics.on('call_after_close', (event) => unflushedEvents.push(event))
await analytics.closeAndFlush()
console.log(unflushedEvents) // all events that came in after closeAndFlush was called
Regional configuration
For Business plans with access to Regional Segment, you can use the `host` configuration parameter to send data to the desired region:

- Oregon (Default) — `api.segment.io/v1`
- Dublin — `events.eu1.segmentapis.com`
An example of setting the host to the EU endpoint using the Node library is:
const analytics = new Analytics({
...
host: "https://events.eu1.segmentapis.com"
});
Error handling
To keep track of errors, subscribe and log all event delivery errors by running:
const analytics = new Analytics({ writeKey: '<MY_WRITE_KEY>' })
analytics.on('error', (err) => console.error(err))
Event emitter interface
The event emitter interface allows you to listen for events, like Track and Identify calls, and it calls the function you provide with some arguments upon successful delivery. The `error` event emits on delivery error.
analytics.on('error', (err) => console.error(err))
analytics.on('identify', (ctx) => console.log(ctx))
analytics.on('track', (ctx) => console.log(ctx))
Use the emitter to log all HTTP Requests.
analytics.on('http_request', (event) => console.log(event))
// when triggered, emits an event of the shape:
{
url: 'https://api.segment.io/v1/batch',
method: 'POST',
headers: {
'Content-Type': 'application/json',
...
},
body: '...',
}
Plugin architecture
When you develop in Analytics.js 2.0, the plugins you write can improve functionality, enrich data, and control the flow and delivery of events. From modifying event payloads to changing analytics functionality, plugins help to speed up the process of getting things done.
Though middlewares function the same as plugins, it’s best to use plugins as they are easier to implement and are more testable.
Plugin categories
Plugins are bound by Analytics.js 2.0 which handles operations such as observability, retries, and error handling. There are two different categories of plugins:
- Critical Plugins: Analytics.js expects this plugin to be loaded before starting event delivery. Failure to load a critical plugin halts event delivery. Use this category sparingly, and only for plugins that are critical to your tracking.
- Non-critical Plugins: Analytics.js can start event delivery before this plugin finishes loading. This means your plugin can fail to load independently from all other plugins. For example, every Analytics.js destination is a non-critical plugin. This makes it possible for Analytics.js to continue working if a partner destination fails to load, or if users have ad blockers turned on that are targeting specific destinations.
Non-critical plugins are only non-critical from a loading standpoint. For example, if a `before` plugin crashes, this can still halt the event delivery pipeline.
Non-critical plugins run through a timeline that executes in order of insertion based on the entry type. Segment has these five entry types of non-critical plugins:
| Type | Details |
|---|---|
| `before` | Executes before event processing begins. These are plugins that run before any other plugins run. For example, validating events before passing them along to other plugins. A failure here could halt the event pipeline. See the example of how Analytics.js uses the Event Validation plugin to verify that every event has the correct shape. |
| `enrichment` | Executes as the first level of event processing. These plugins modify an event. See the example of how Analytics.js uses the Page Enrichment plugin to enrich every event with page information. |
| `destination` | Executes as events begin to pass off to destinations. This doesn’t modify the event outside of the specific destination, and failure doesn’t halt the execution. |
| `after` | Executes after all event processing completes. You can use this to perform cleanup operations. An example of this is the Segment.io plugin, which waits for destinations to succeed or fail so it can send observability metrics. |
| `utility` | Executes once during the bootstrap, to give you an outlet to make any modifications as to how Analytics.js works internally. This allows you to augment Analytics.js functionality. |
Example plugins
Here’s an example of a plugin that converts all track event names to lowercase before the event goes through the rest of the pipeline:
export const lowercase: Plugin = {
name: 'Lowercase events',
type: 'enrichment',
version: '1.0.0',
isLoaded: () => true,
load: () => Promise.resolve(),
track: (ctx) => {
ctx.updateEvent('event', ctx.event.event.toLowerCase())
return ctx
}
}
const identityStitching = () => {
let user
const identity = {
// Identifies your plugin in the Plugins stack.
// Access `window.analytics.queue.plugins` to see the full list of plugins
name: 'Identity Stitching',
// Defines where in the event timeline a plugin should run
type: 'enrichment',
version: '0.1.0',
// Used to signal that a plugin has been properly loaded
isLoaded: () => user !== undefined,
// Runs once when the plugin is registered; capture the analytics
// user object so `identify` can update its traits below
load: async (_ctx, ajs) => {
user = await ajs.user()
},
// Applies the plugin code to every `identify` call in Analytics.js
// You can override any of the existing types in the Segment Spec.
async identify(ctx) {
// Request some extra info to enrich your `identify` events from
// an external API.
const req = await fetch(
`https://jsonplaceholder.typicode.com/users/${ctx.event.userId}`
)
const userReq = await req.json()
// ctx.updateEvent can be used to update deeply nested properties
// in your events. It's a safe way to change events as it'll
// create any missing objects and properties you may require.
ctx.updateEvent('traits.custom', userReq)
user.traits(userReq)
// Every plugin must return a `ctx` object, so that the event
// timeline can continue processing.
return ctx
},
}
return identity
}
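Here’s another sketch in the same style: a hypothetical `before`-type plugin that validates events before they continue through the pipeline (the plugin name and validation rule are illustrative, not part of the library):

export const requireEventName: Plugin = {
  name: 'Require event name',
  type: 'before',
  version: '1.0.0',
  isLoaded: () => true,
  load: () => Promise.resolve(),
  track: (ctx) => {
    // throwing here halts the event pipeline, which is the expected
    // behavior when a `before` plugin fails
    if (!ctx.event.event) {
      throw new Error('Track events must include an event name')
    }
    return ctx
  }
}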
You can view Segment’s existing plugins to see more examples.
Register a plugin
Registering plugins enables you to modify your analytics implementation to best fit your needs. You can register a plugin using this:
// A promise will resolve once the plugins have been successfully loaded into Analytics.js
// Register multiple plugins at once by using the variable args interface in Analytics.js
await analytics.register(pluginA, pluginB, pluginC)
Deregister a plugin
Deregister a plugin by using:
await analytics.deregister("pluginNameA", "pluginNameB") // takes strings
Selecting Destinations
The Alias, Group, Identify, Page, and Track calls can all be passed an object of `integrations` that lets you turn certain destinations on or off. By default all destinations are enabled.

Here’s an example with the `integrations` object shown:
analytics.track({
event: 'Membership Upgraded',
userId: '97234974',
integrations: {
'All': false,
'Vero': true,
'Google Analytics': false
}
})
In this case, the example specifies that this Track event should only go to Vero: `'All': false` says that no destination should be enabled unless otherwise specified, and `'Vero': true` turns on Vero.
Destination flags are case sensitive and match the destination’s name in the docs (for example, “AdLearn Open Platform”, “awe.sm”, or “MailChimp”). In some cases, there may be several names for a destination; if that happens you’ll see a “Adding (destination name) to the Integrations Object” section in the destination’s doc page with a list of valid names.
Note:

- Business Tier users can filter Track calls right from the Segment UI on your source schema page. Segment recommends using the UI if possible since it’s a much simpler way of managing your filters and can be updated with no code changes on your side.
- If you are on a grandfathered plan, events sent server-side that are filtered through the Segment dashboard still count towards your API usage.
Historical Import
You can import historical data by adding the `timestamp` argument to any of your method calls. This can be helpful if you’ve just switched to Segment.

Historical imports can only be done into destinations that can accept historical timestamped data. Most analytics tools like Mixpanel, Amplitude, and Kissmetrics can handle that type of data just fine. One common destination that does not accept historical data is Google Analytics, since its API cannot accept historical data.

Note: If you’re tracking things that are happening right now, leave out the `timestamp` and Segment’s servers will timestamp the requests for you.
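For example, a backfilled Track call might look like the following sketch (the event, ID, and date are placeholders):

// import a past event by setting an explicit timestamp
analytics.track({
  userId: '019mr8mf4r',
  event: 'Item Purchased',
  properties: { revenue: 39.95 },
  timestamp: new Date('2023-04-12T15:02:00Z') // when the event actually happened
})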
Batching
Segment’s libraries are built to support high performance environments. That means it is safe to use Segment’s Node library on a web server that’s serving hundreds of requests per second.
Calling a method doesn’t result in an immediate HTTP request; instead, the message is queued in memory. Messages are then flushed in batches in the background, which allows for much faster operation.
By default, Segment’s library will flush:

- every 15 messages (controlled by `settings.flushAt`), or
- when 10 seconds have passed since the last flush (controlled by `settings.flushInterval`).

There is a maximum of 500KB per batch request and 32KB per call.
If you don’t want to batch messages, you can turn batching off by setting the `flushAt` setting to `1`, like so:
const analytics = new Analytics({
...
flushAt: 1
});
Batching means that your message might not get sent right away. Every method call takes an optional `callback`, which you can use to know when a particular message is flushed from the queue, like so:
analytics.track({
userId: '019mr8mf4r',
event: 'Ultimate Played',
},
(err, ctx) => {
...
}
)
Multiple Clients
Different parts of your application may require different types of batching, or even sending to multiple Segment sources. In that case, you can initialize multiple instances of `Analytics` with different settings:
const marketingAnalytics = new Analytics({ writeKey: 'MARKETING_WRITE_KEY' });
const appAnalytics = new Analytics({ writeKey: 'APP_WRITE_KEY' });
Override the default HTTP Client
Segment attempts to use the global `fetch` implementation if available in order to support several diverse environments. Some special cases (for example, an HTTP proxy) may require a different implementation for HTTP communication. You can provide a customized wrapper in the Analytics configuration to support this. Here are a few approaches:
Use a custom fetch-like implementation with proxy (simple, recommended)
import { Analytics, HTTPFetchFn } from '@segment/analytics-node'
import axios from 'axios'
const httpClient: HTTPFetchFn = async (url, { body, ...options }) =>
axios({
url,
data: body,
proxy: {
protocol: 'http',
host: 'proxy.example.com',
port: 8886,
auth: {
username: 'user',
password: 'pass',
},
},
...options,
})
const analytics = new Analytics({
writeKey: '<YOUR_WRITE_KEY>',
httpClient,
})
Augment the default HTTP Client
import { Analytics, FetchHTTPClient, HTTPClientRequest } from '@segment/analytics-node'

class MyClient extends FetchHTTPClient {
async makeRequest(options: HTTPClientRequest) {
return super.makeRequest({
...options,
headers: { ...options.headers, foo: 'bar' },
})
}
}
const analytics = new Analytics({
writeKey: '<YOUR_WRITE_KEY>',
httpClient: new MyClient()
})
Completely override the full HTTPClient (Advanced, you probably don’t need to do this)
import { Analytics, HTTPClient, HTTPClientRequest } from '@segment/analytics-node'

class CustomClient implements HTTPClient {
async makeRequest(options: HTTPClientRequest) {
return someRequestLibrary(options.url, {
method: options.method,
body: JSON.stringify(options.data), // serialize data
headers: options.headers,
})
}
}
const analytics = new Analytics({
writeKey: '<YOUR_WRITE_KEY>',
httpClient: new CustomClient()
})
Override context value

Each call accepts an optional `context` object; use it to attach extra context to the event payload, for example:
analytics.track({
anonymousId: '48d213bb-95c3-4f8d-af97-86b2b404dcfe',
event: 'New Test',
properties: {
revenue: 39.95,
shippingMethod: '2-day'
},
context: {
traits: {
"email": "test@test.com"
}
}
});
OAuth 2.0
Enable OAuth 2.0 in your Segment workspace to guarantee authorized communication between your server environment and Segment’s Tracking API. To support the non-interactive server environment, the OAuth workflow used is a signed client assertion JWT.
You will need a public and private key pair where:
- The public key is uploaded to the Segment dashboard.
- The private key is kept in your server environment to be used by this SDK.
Your server will verify its identity by signing a token request and will receive a token that is used to authorize all communication with the Segment Tracking API.
You’ll need to provide the OAuth Application ID and the public key’s ID, both of which are provided in the Segment dashboard. There are also options available to specify the authorization server, custom scope, maximum number of retries, or a custom HTTP client if your environment has special rules for separate Segment endpoints.
Be sure to implement handling for Analytics SDK errors. Good logging helps distinguish any configuration issues.
For more information, see the Segment OAuth 2.0 documentation.
import { Analytics, OAuthSettings } from '@segment/analytics-node';
import { readFileSync } from 'fs'
const privateKey = readFileSync('private.pem', 'utf8')
const settings: OAuthSettings = {
clientId: '<CLIENT_ID_FROM_DASHBOARD>',
clientKey: privateKey,
keyId: '<PUB_KEY_ID_FROM_DASHBOARD>',
}
const analytics = new Analytics({
writeKey: '<MY_WRITE_KEY>',
oauthSettings: settings,
})
analytics.on('error', (err) => { console.error(err) })
analytics.track({ userId: 'foo', event: 'bar' })
Troubleshooting
The following tips often help resolve common issues.
No events in my debugger
- Double check that you’ve followed all the steps in the Quickstart.
- Make sure that you’re calling a Segment API method once the library is successfully installed, like `identify` or `track`.
- Make sure your application isn’t shutting down before the local queue of events is pushed to Segment. You can call `analytics.closeAndFlush()` to ensure the queue is fully processed before shutdown.
Other common errors
If you are experiencing data loss from your source, you may be experiencing one or more of the following common errors:
- Payload is too large: If you attempt to send events larger than 32KB per normal API request or batches of events larger than 500KB per request, Segment’s tracking API responds with `400 Bad Request`. Try sending smaller events (or smaller batches) to correct this error.
- Identifier is not present: Segment’s tracking API requires that each payload has a `userId` and/or `anonymousId`. If you send events without either the `userId` or `anonymousId`, Segment’s tracking API responds with a `no_user_anon_id` error. Check the event payload and client instrumentation for more details.
- Track event is missing name: All Track events sent to Segment must have a name in string format.
- Event dropped during deduplication: Segment automatically adds a `messageId` field to all payloads and uses this value to deduplicate events. If you’re manually setting a `messageId` value, ensure that each event has a unique value.
- Incorrect credentials: Double check your credentials for your downstream destination(s).
- Destination incompatibility: Make sure that the destination you are troubleshooting can accept server-side API calls. You can see compatibility information on the Destination comparison by category page and in the documentation for your specific destination.
- Destination-specific requirements: Check out the destination’s documentation to see if there are other requirements for using the method and destination that you’re trying to get working.