
n8n — OpenTelemetry

Instrument self-hosted n8n with OpenTelemetry to ship workflow execution traces, node spans, and error details to Last9.

n8n is a Node.js application, so OpenTelemetry instruments it the same way as any other Node service: a tracing.js bootstrap file loaded via NODE_OPTIONS=--require. The auto-instrumentations package then emits HTTP, database, and runtime spans without any changes to n8n itself.
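The preload mechanism is standard Node behavior, not anything n8n-specific; conceptually, the container entrypoint amounts to the following (the tracing.js path matches the volume mount configured later in this guide):

```shell
# --require loads tracing.js before n8n's own entry point executes,
# so the OTel SDK is initialized before any instrumented module is imported
NODE_OPTIONS="--require /home/node/tracing.js" n8n start
```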

The result: every workflow execution becomes a trace in Last9 with span-level visibility into which nodes ran, how long each took, and which external APIs or databases were involved.

What you get

  • Per-execution traces — one trace per workflow run, with a span tree showing which nodes ran, in what order, and how long each took
  • Failed-node visibility — when a workflow errors out, the span carries the error message and stack trace
  • External call latency — HTTP/DB spans from auto-instrumentation show whether a slow workflow is actually a slow upstream API
  • Throughput and error-rate metrics — derived from traces, queryable in PromQL from Metrics Explorer

Prerequisites

  • A self-hosted n8n instance you can rebuild (the setup below uses Docker and Docker Compose)
  • A Last9 account with OTLP credentials (endpoint and Basic auth token)

Setup

  1. Get your Last9 OTLP credentials

    From Integrations → OpenTelemetry in the Last9 dashboard. You need:

    • Endpoint — https://<region>.last9.io (shown in the dashboard)
    • Auth header — the full Authorization=Basic <token> string
  2. Create a tracing.js bootstrap file

    This file initializes the OTel SDK before n8n starts.

    // tracing.js
    "use strict";
    const { NodeSDK } = require("@opentelemetry/sdk-node");
    const {
      getNodeAutoInstrumentations,
    } = require("@opentelemetry/auto-instrumentations-node");
    const {
      OTLPTraceExporter,
    } = require("@opentelemetry/exporter-trace-otlp-http");

    // Parse OTEL_EXPORTER_OTLP_HEADERS (format: "Key=Value,Key2=Value2")
    const rawHeaders = process.env.OTEL_EXPORTER_OTLP_HEADERS || "";
    const parsedHeaders = Object.fromEntries(
      rawHeaders
        .split(",")
        .filter(Boolean)
        .map((pair) => {
          const idx = pair.indexOf("=");
          return [pair.slice(0, idx).trim(), pair.slice(idx + 1).trim()];
        }),
    );

    const sdk = new NodeSDK({
      traceExporter: new OTLPTraceExporter({
        url:
          (process.env.OTEL_EXPORTER_OTLP_ENDPOINT || "").replace(/\/$/, "") +
          "/v1/traces",
        headers: parsedHeaders,
      }),
      instrumentations: [
        getNodeAutoInstrumentations({
          // Reduce noise from internal file system calls
          "@opentelemetry/instrumentation-fs": { enabled: false },
        }),
      ],
    });

    sdk.start();
    console.log("[otel] SDK started, service:", process.env.OTEL_SERVICE_NAME);

    // Graceful shutdown — flush pending spans before process exits
    const shutdown = () => {
      sdk
        .shutdown()
        .then(() => process.exit(0))
        .catch(() => process.exit(1));
    };
    process.on("SIGTERM", shutdown);
    process.on("SIGINT", shutdown);
  3. Create a custom Docker image

    The official n8n image doesn’t include OTel packages. Build a thin wrapper that adds them:

    # Dockerfile
    FROM n8nio/n8n:latest
    USER root
    RUN npm install --prefix /usr/local/lib \
        @opentelemetry/api \
        @opentelemetry/sdk-node \
        @opentelemetry/auto-instrumentations-node \
        @opentelemetry/exporter-trace-otlp-http
    # Verify the packages resolved where Node expects them
    RUN node -e "require('@opentelemetry/sdk-node'); console.log('OTel OK')"
    USER node
  4. Wire the environment variables

    # docker-compose.yml
    services:
      n8n:
        build: .
        ports:
          - "5678:5678"
        environment:
          NODE_OPTIONS: "--require /home/node/tracing.js"
          OTEL_SERVICE_NAME: "n8n"
          OTEL_EXPORTER_OTLP_ENDPOINT: "${OTEL_EXPORTER_OTLP_ENDPOINT}"
          OTEL_EXPORTER_OTLP_HEADERS: "${OTEL_EXPORTER_OTLP_HEADERS}"
          OTEL_EXPORTER_OTLP_PROTOCOL: "http/protobuf"
          OTEL_RESOURCE_ATTRIBUTES: "deployment.environment=production"
          # Suppress verbose OTel diagnostic output
          OTEL_LOG_LEVEL: "error"
          # Standard n8n config
          N8N_BASIC_AUTH_ACTIVE: "true"
          N8N_BASIC_AUTH_USER: "${N8N_USER:-admin}"
          N8N_BASIC_AUTH_PASSWORD: "${N8N_PASSWORD:-changeme}"
          WEBHOOK_URL: "http://localhost:5678/"
        volumes:
          - ./tracing.js:/home/node/tracing.js:ro
          - n8n_data:/home/node/.n8n
    volumes:
      n8n_data:

    Set credentials in a .env file:

    OTEL_EXPORTER_OTLP_ENDPOINT=https://<your-last9-otlp-endpoint>
    OTEL_EXPORTER_OTLP_HEADERS=Authorization=Basic <your-last9-auth-token>
  5. Start and verify

    docker compose up --build

    Trigger any workflow — even the manual trigger on an empty workflow generates HTTP spans. Open Traces Explorer and filter by service.name = n8n. Within a few seconds you should see:

    • POST /rest/workflows/:id/run — the execution trigger span
    • Child spans for outbound HTTP/DB calls made by individual nodes
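One detail worth knowing about the endpoint configuration: tracing.js appends the /v1/traces signal path itself, stripping any trailing slash first, so the value in your .env file should be the bare endpoint. A standalone sketch of that normalization:

```javascript
// Mirrors the URL handling in tracing.js: trailing slash stripped,
// then the OTLP/HTTP traces signal path appended
function traceUrl(endpoint) {
  return endpoint.replace(/\/$/, "") + "/v1/traces";
}

// Endpoint hostname below is a placeholder, not a real Last9 endpoint
console.log(traceUrl("https://otlp.example.last9.io/"));
// → https://otlp.example.last9.io/v1/traces
```

Both "https://host" and "https://host/" therefore produce the same exporter URL, which avoids a subtle double-slash 404 against the OTLP endpoint.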

Filtering useful spans in Last9

Once data flows, a few queries are particularly useful:

Slow workflows — p99 duration across all n8n executions:

histogram_quantile(0.99, rate(traces_span_metrics_duration_milliseconds_bucket{service_name="n8n"}[5m]))

Error rate by workflow:

rate(traces_span_metrics_calls_total{service_name="n8n", status_code="STATUS_CODE_ERROR"}[5m])

Slowest external APIs called by n8n nodes — filter by http.method and sort http.url by duration in the Traces UI.

Complete working example

A ready-to-run Docker Compose setup with Dockerfile, tracing.js, sample workflow, and a test script is available in the opentelemetry-examples repository under n8n-otel/.

Troubleshooting

No traces appear. Confirm NODE_OPTIONS=--require /home/node/tracing.js is set before n8n starts. Setting it inside a running container has no effect.

Cannot find module '@opentelemetry/sdk-node'. Packages must be installed where Node resolves global modules. Run npm root -g inside the container to confirm the path matches your npm install --prefix in the Dockerfile.

Auth header rejected. The header value must include the Authorization=Basic prefix. The full string from Last9 (including Basic ) goes into OTEL_EXPORTER_OTLP_HEADERS.
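If you suspect a malformed header string, the parsing logic from tracing.js can be exercised standalone (parseHeaders below mirrors the bootstrap file; the token value is a placeholder):

```javascript
// Mirrors the header parsing in tracing.js: "Key=Value,Key2=Value2"
function parseHeaders(raw) {
  return Object.fromEntries(
    raw
      .split(",")
      .filter(Boolean)
      .map((pair) => {
        const idx = pair.indexOf("=");
        return [pair.slice(0, idx).trim(), pair.slice(idx + 1).trim()];
      }),
  );
}

// Only the FIRST "=" splits key from value, so "Basic <token>" survives intact
console.log(parseHeaders("Authorization=Basic abc123"));
// → { Authorization: 'Basic abc123' }
```

If the printed object shows a truncated or empty Authorization value, the string in OTEL_EXPORTER_OTLP_HEADERS is the problem, not the exporter.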

Please get in touch with us on Discord or Email if you have any questions.