Node.js Guide
Installation
cmake -B build -DCMAKE_BUILD_TYPE=Release
cmake --build build -j$(nproc)
cd source/bindings/nodejs
npm install && npm run build
Config & Creation
The Node.js binding uses camelCase property names. Timestamps are BigInt
(64-bit Unix seconds); chunk/interval sizes are plain numbers.
const { DDStream } = require('./source/bindings/nodejs')
const now = BigInt(Math.floor(Date.now() / 1000))
const stream = new DDStream({
apiKey: 'YOUR_DD_API_KEY',
appKey: 'YOUR_DD_APP_KEY',
query: 'avg:system.cpu.user{*}',
rangeStart: now - 86400n, // last 24 h (BigInt arithmetic)
rangeEnd: now,
granularity: 60, // plain Number
chunkSeconds: 14400, // plain Number
enablePrefetch: true,
logLevel: 0, // 0=NONE … 4=DEBUG
})
// Validate credentials without streaming
const err = await DDStream.validate({
apiKey: 'YOUR_DD_API_KEY',
appKey: 'YOUR_DD_APP_KEY',
})
if (err) throw new Error(err)
BigInt timestamps
rangeStart and rangeEnd must be BigInt values; a plain Number loses
integer precision beyond 2^53. Use the n-suffix literal (86400n) or
BigInt(value) when doing arithmetic.
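A minimal, standalone sketch of the BigInt arithmetic described above (no DDStream needed; the toUnixSeconds helper is illustrative, not part of the binding):

```javascript
// Convert a JS Date (default: now) to a BigInt Unix-seconds timestamp.
function toUnixSeconds(date = new Date()) {
  return BigInt(Math.floor(date.getTime() / 1000))
}

const now = toUnixSeconds()
const oneDay = 86400n               // BigInt literal via the n suffix
const rangeStart = now - oneDay     // BigInt arithmetic stays exact
const rangeEnd = now

// Mixing BigInt and Number in one expression throws a TypeError,
// so convert explicitly when combining the two:
const granularity = 60              // plain Number, as in the config
const buckets = Number((rangeEnd - rangeStart) / BigInt(granularity))
console.log(buckets)                // 1440 one-minute buckets in 24 h
```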
Async/Await
Fetches run on the libuv thread pool so the event loop is never blocked.
Use for await…of or the explicit pull API:
stream.reset()
for await (const batch of stream) {
console.log(`${batch.count} points, progress ${(stream.progress()*100).toFixed(1)}%`)
}
stream.destroy()

stream.reset()
while (stream.hasNext()) {
const batch = await stream.nextBatch()
if (batch === null) {
console.error('Error:', stream.lastError())
break
}
console.log(`Batch: ${batch.count} points`)
if (batch.isLast) break
}
stream.destroy()
Typed Arrays
Times are BigInt64Array (Unix milliseconds for v2, seconds for v1).
Value arrays are Float64Array.
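Because batch fields are ordinary typed arrays, the access patterns can be exercised without a live stream. A runnable sketch using mock data in the same shapes (BigInt64Array times, Float64Array values — the mock, not a real batch):

```javascript
// Mock a v1-style batch: three one-minute points.
const mockBatch = {
  count: 3,
  times: new BigInt64Array([1700000000n, 1700000060n, 1700000120n]), // seconds
  avg: new Float64Array([12.5, 15.0, 10.0]),
}

// Typed arrays support the usual indexed-loop and reduce patterns.
let total = 0
for (let i = 0; i < mockBatch.count; i++) total += mockBatch.avg[i]
const mean = total / mockBatch.count
console.log(mean.toFixed(2)) // "12.50"

// Times come back as BigInt, so convert before doing Date math:
const firstMs = Number(mockBatch.times[0]) * 1000
console.log(new Date(firstMs).toISOString())
```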
const batch = await stream.nextBatch()
// v1 batch shape
const times = batch.times // BigInt64Array - epoch ms (v2) or seconds (v1)
const avgs = batch.avg // Float64Array
const mins = batch.min // Float64Array
const maxs = batch.max // Float64Array
const sums = batch.sum // Float64Array
// Compute statistics
let totalAvg = 0
for (let i = 0; i < batch.count; i++) {
totalAvg += avgs[i]
}
console.log(`Mean CPU: ${(totalAvg / batch.count).toFixed(2)}%`)
// Chunk metadata
console.log(`Chunk: ${batch.chunkStartMs}–${batch.chunkEndMs} ms`)
console.log(`Is last: ${batch.isLast}`)
Cleanup
Always call stream.destroy() when finished. This frees the underlying
C++ handle and cancels any in-flight prefetch. If you call destroy()
while a nextBatch() is pending, it cancels the fetch and defers the
handle free until the worker completes.
// Pattern: wrap in try/finally
stream.reset()
try {
for await (const batch of stream) {
// process batch
}
} finally {
stream.destroy()
}
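The try/finally pattern can be factored into a small helper. This is a sketch, not part of the binding's API (withStream is a hypothetical name); it only relies on the reset()/destroy() contract described above and guarantees destroy() runs even if processing throws:

```javascript
// Run `fn` against a stream, always destroying the stream afterwards.
async function withStream(stream, fn) {
  stream.reset()
  try {
    return await fn(stream)
  } finally {
    stream.destroy()
  }
}

// Usage sketch (stream would be a real DDStream instance):
// await withStream(stream, async (s) => {
//   for await (const batch of s) { /* process batch */ }
// })
```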