Examples
Basic Stream (v1)
Fetch metrics using the Datadog v1 API with the generic Stream constructor.
Every example goes through the generic stream_create(adapter, config_json) call
under the hood; no Datadog-specific fields leak into the binding layer. Progress
tracking and adaptive retry are built into every stream.
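To make the config_json argument concrete, here is a minimal sketch of building and serializing a v1 config. Only the "api_key" field appears verbatim in the examples below; the remaining field names are illustrative assumptions, not a definitive schema:

```python
import json

# Hypothetical v1 config; only "api_key" appears verbatim in the examples
# below -- the other field names are illustrative assumptions.
config = {
    "api_key": "<DD_API_KEY>",
    "app_key": "<DD_APP_KEY>",
    "query": "avg:system.cpu.user{*}",
    "from_ts": 1_700_000_000,
    "to_ts": 1_700_003_600,
}

# The binding layer only ever sees opaque JSON, so adapter-specific
# fields round-trip untouched.
config_json = json.dumps(config)
```

Because the config travels as opaque JSON, a new adapter needs no binding-layer changes: any fields it understands simply pass through.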
"""
basic_stream.py - stream a Datadog metric and print per-batch statistics.
Usage:
python basic_stream.py <api_key> <app_key> <query> [from_ts] [to_ts]
Example:
python basic_stream.py $DD_API_KEY $DD_APP_KEY "avg:system.cpu.user{*}"
"""
import sys
import time
from ddstream import Stream
def main():
if len(sys.argv) < 4:
print(__doc__)
sys.exit(1)
    ...

V2 Timeseries
Stream Datadog v2 timeseries with multi-query support, formulas, and group-by
tags. Uses "datadog_v2_ts" adapter with a POST body containing queries and
interval configuration. Each batch returns a times array and a list of series
with per-series group tags.
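To make that batch shape concrete, here is a hedged sketch of flattening one batch into (timestamp, tags, value) rows. The exact key names ("times", "series", "group_tags", "values") are assumptions based on the description above:

```python
# Hypothetical v2 batch, shaped after the description above: one shared
# "times" array plus a list of series, each carrying its group tags.
batch = {
    "times": [1700000000, 1700000060],
    "series": [
        {"group_tags": ["host:web-1"], "values": [0.42, 0.39]},
        {"group_tags": ["host:web-2"], "values": [0.55, None]},
    ],
}

rows = []
for series in batch["series"]:
    tags = ",".join(series["group_tags"])
    for ts, value in zip(batch["times"], series["values"]):
        if value is not None:  # treat nulls as gaps and skip them
            rows.append((ts, tags, value))
```

Sharing one times array across all series keeps batches compact when a group-by fans out into many tag combinations.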
"""
v2_timeseries.py - stream Datadog v2 timeseries with formulas and group-by tags.
Usage:
python v2_timeseries.py <api_key> <app_key> [from_ts] [to_ts]
"""
import sys
import time
from ddstream import Stream
def main():
if len(sys.argv) < 3:
print(__doc__)
sys.exit(1)
now = int(time.time())
config = {
"api_key": sys.argv[1],
        ...

CSV Export
Stream metrics and write each data point to a CSV file with columns
timestamp,avg,min,max,sum.
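A minimal sketch of the write loop, using sample aggregated points in place of a live stream. The point shape is an assumption; only the column layout comes from the description above:

```python
import csv
import io

FIELDS = ["timestamp", "avg", "min", "max", "sum"]

# Sample aggregated points standing in for one streamed batch; the real
# script would compute these per batch as it iterates the stream.
points = [
    {"timestamp": 1700000000, "avg": 0.41, "min": 0.30, "max": 0.52, "sum": 4.1},
    {"timestamp": 1700000060, "avg": 0.44, "min": 0.35, "max": 0.58, "sum": 4.4},
]

buf = io.StringIO()  # swap in open("output.csv", "w", newline="") for a file
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(points)

csv_text = buf.getvalue()
```

Writing per batch rather than buffering everything keeps memory flat however long the time range is.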
"""
to_csv.py - stream a Datadog metric and write it to a CSV file.
Usage:
python to_csv.py <api_key> <app_key> <query> <output.csv> [from_ts] [to_ts]
"""
import csv
import sys
import time
from ddstream import Stream
def main():
if len(sys.argv) < 5:
print(__doc__)
sys.exit(1)
now = int(time.time())
config = {
        ...

Pandas / DataFrames
Accumulate all batches into a pandas DataFrame indexed by timestamp.
Useful for local analysis, plotting with matplotlib, or writing to parquet.
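The accumulate-then-concat pattern can be sketched without a live stream; the per-batch record shape here is an assumption:

```python
import pandas as pd

# Two fake batches standing in for what the stream would yield.
batches = [
    [{"timestamp": 1700000000, "avg": 0.41},
     {"timestamp": 1700000060, "avg": 0.44}],
    [{"timestamp": 1700000120, "avg": 0.39}],
]

# Build one small DataFrame per batch and concat once at the end;
# appending row-by-row to a single DataFrame is quadratic.
chunks = [pd.DataFrame(batch) for batch in batches]
df = pd.concat(chunks, ignore_index=True).set_index("timestamp")
```

The index is the raw epoch timestamp; pd.to_datetime(df.index, unit="s") converts it for plotting, and df.to_parquet("out.parquet") persists it.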
"""
to_pandas.py - stream a Datadog metric into a pandas DataFrame.
Usage:
python to_pandas.py <api_key> <app_key> <query> [from_ts] [to_ts]
Requirements:
pip install pandas
"""
import sys
import time
import pandas as pd
from ddstream import Stream
def stream_to_dataframe(adapter_name: str, config: dict) -> pd.DataFrame:
chunks = []
with Stream(adapter_name, config) as stream:
for batch in stream:
            ...

JSONL Export
Stream metrics from Node.js and write each data point as a newline-delimited
JSON record, a format compatible with BigQuery, Elasticsearch, and other
line-JSON consumers.
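The record shape itself is language-neutral; here is a short Python sketch of the same newline-delimited output (the field names are illustrative assumptions, mirroring the other examples):

```python
import io
import json

# Sample points; field names are illustrative assumptions.
points = [
    {"timestamp": 1700000000, "value": 0.42},
    {"timestamp": 1700000060, "value": 0.39},
]

buf = io.StringIO()  # stands in for the output .jsonl file
for point in points:
    # One compact JSON object per line: the contract that BigQuery,
    # Elasticsearch, and most line-JSON loaders expect.
    buf.write(json.dumps(point, separators=(",", ":")) + "\n")

jsonl_text = buf.getvalue()
```

Each line is an independent JSON document, so consumers can split on newlines and parse records in parallel or resume mid-file.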
/**
 * to_jsonl.js - stream a Datadog metric and write each point as a JSON line.
 *
 * Usage:
 *   node to_jsonl.js <api_key> <app_key> <query> <output.jsonl> [from_ts] [to_ts]
 */
'use strict';

const fs = require('fs');
const { Stream } = require('../../source/bindings/nodejs/stream');

async function main() {
  const args = process.argv.slice(2);
  if (args.length < 4) {
    console.log('Usage: node to_jsonl.js <api_key> <app_key> <query> <output.jsonl> [from_ts] [to_ts]');
    process.exit(1);
  }
  const now = Math.floor(Date.now() / 1000);
...