
Feb 25th, '25 / 6 min read

How to Implement OpenTelemetry in NestJS

Learn how to integrate OpenTelemetry with NestJS to capture and export traces, improving observability and performance monitoring.

Modern applications are becoming increasingly complex, and debugging distributed systems can feel like searching for a needle in a haystack. This is where OpenTelemetry (OTel) comes in.

If you're using NestJS, integrating OpenTelemetry can provide deep insights into your application's behavior, helping you track performance, troubleshoot issues, and understand service interactions.

This guide goes beyond the basics, covering advanced configurations, manual instrumentation, distributed tracing, and optimizing OpenTelemetry for production-grade NestJS applications.

Understanding OpenTelemetry: The Core Concepts

OpenTelemetry is an open-source observability framework that provides APIs, libraries, and tools to collect and export telemetry data (traces, metrics, and logs). It helps developers gain insights into application performance and diagnose issues in distributed systems.

Key Components of OpenTelemetry

  • Traces: Capture the journey of a request through different services.
  • Metrics: Provide insights into system performance and resource utilization.
  • Logs: Offer context-rich information that complements traces and metrics.
πŸ’‘
For a deeper understanding of how traces fit into the broader observability landscape, check out this guide on metrics, events, logs, and traces.

Why OpenTelemetry is Essential for NestJS Applications

NestJS follows a modular architecture and often integrates with microservices, making observability crucial. OpenTelemetry provides:

  • Distributed Tracing: Track how requests propagate through various components.
  • Custom Metrics Collection: Monitor system performance and detect bottlenecks.
  • Seamless Log Correlation: Improve debugging by linking logs with traces.
  • Extensibility: Supports multiple exporters like Jaeger, Prometheus, and Zipkin.

Step-by-Step Guide to Set Up OpenTelemetry in a NestJS Application

1. Installing Required Dependencies

Run the following command to install OpenTelemetry packages:

npm install @opentelemetry/api @opentelemetry/sdk-node @opentelemetry/instrumentation-http @opentelemetry/instrumentation-express @opentelemetry/instrumentation-fs @opentelemetry/exporter-trace-otlp-http

2. Configuring OpenTelemetry SDK in NestJS

Create a tracing.ts file to configure and initialize OpenTelemetry:

import { NodeTracerProvider } from '@opentelemetry/sdk-trace-node';
import { BatchSpanProcessor } from '@opentelemetry/sdk-trace-base';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';
import { registerInstrumentations } from '@opentelemetry/instrumentation';
import { HttpInstrumentation } from '@opentelemetry/instrumentation-http';
import { ExpressInstrumentation } from '@opentelemetry/instrumentation-express';

const provider = new NodeTracerProvider();
const exporter = new OTLPTraceExporter({ url: 'http://localhost:4318/v1/traces' });

provider.addSpanProcessor(new BatchSpanProcessor(exporter)); // Using BatchSpanProcessor for optimized performance
provider.register();

registerInstrumentations({
  instrumentations: [
    new HttpInstrumentation(),
    new ExpressInstrumentation(),
  ],
});

console.log('OpenTelemetry initialized');

3. Integrating OpenTelemetry into the NestJS Entry Point

Import tracing.ts in your main.ts file:

import './tracing';
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  await app.listen(3000);
}
bootstrap();

4. Validating Tracing Data

Start your NestJS application:

node dist/main

Use Jaeger or another OpenTelemetry-compatible backend to visualize traces.
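
If you don't have a tracing backend running yet, one quick option is Jaeger's all-in-one Docker image, which accepts OTLP on port 4318 and serves its UI on port 16686 (the container name here is just an example):

docker run -d --name jaeger \
  -e COLLECTOR_OTLP_ENABLED=true \
  -p 16686:16686 \
  -p 4318:4318 \
  jaegertracing/all-in-one:latest

Once it's running, open http://localhost:16686, send a few requests to your NestJS app, and look for your service in the Jaeger UI.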

πŸ’‘
If you're also working with Jaeger, check out this guide on using Jaeger with OpenTelemetry for more insights.

Advanced OpenTelemetry Features in NestJS

Manually Creating and Managing Spans

In addition to automatic instrumentation, you can manually create spans to track specific operations:

import { trace } from '@opentelemetry/api';

const tracer = trace.getTracer('nestjs-advanced-app');

async function myFunction() {
  const span = tracer.startSpan('custom-operation');
  try {
    // Business logic
  } finally {
    span.end();
  }
}
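
startSpan gives you full manual control, but it does not make the new span the active one. If you want nested operations (for example, outgoing HTTP calls made inside the function) to show up as children, startActiveSpan from the same API is usually a better fit. Here's a minimal sketch using the same tracer; the processOrder function and the order.id attribute are purely illustrative:

import { trace, SpanStatusCode } from '@opentelemetry/api';

const tracer = trace.getTracer('nestjs-advanced-app');

async function processOrder(orderId: string) {
  // startActiveSpan makes the new span active for the duration of the callback,
  // so spans created inside it (e.g. by auto-instrumented HTTP calls) become children.
  return tracer.startActiveSpan('process-order', async (span) => {
    span.setAttribute('order.id', orderId);
    try {
      // Business logic
    } catch (err) {
      // Record the failure on the span so it shows up in your tracing backend
      span.recordException(err as Error);
      span.setStatus({ code: SpanStatusCode.ERROR });
      throw err;
    } finally {
      span.end();
    }
  });
}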

Collecting and Exporting Custom Metrics

To add custom metrics, install the OpenTelemetry metrics SDK:

npm install @opentelemetry/sdk-metrics

Then, configure and register custom metrics:

import { MeterProvider } from '@opentelemetry/sdk-metrics';

const meter = new MeterProvider().getMeter('nestjs-metrics');
const requestCounter = meter.createCounter('http_requests', {
  description: 'Counts incoming HTTP requests',
});

function countRequest() {
  requestCounter.add(1);
}

Call countRequest() inside your request handlers to track API traffic.
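
In a NestJS app, a global interceptor is a convenient place to do that. The sketch below assumes the counter above is exported from a metrics.ts module; the RequestCounterInterceptor name is just an example. Also note that a bare MeterProvider keeps measurements in memory, so in production you would attach a metric reader (for example a PeriodicExportingMetricReader) to actually export them.

import { CallHandler, ExecutionContext, Injectable, NestInterceptor } from '@nestjs/common';
import { Observable } from 'rxjs';
import { requestCounter } from './metrics'; // assumed module exporting the counter shown above

@Injectable()
export class RequestCounterInterceptor implements NestInterceptor {
  intercept(context: ExecutionContext, next: CallHandler): Observable<any> {
    // Increment the counter once per incoming HTTP request, tagged with the method
    const request = context.switchToHttp().getRequest();
    requestCounter.add(1, { method: request.method });
    return next.handle();
  }
}

Register it globally in main.ts with app.useGlobalInterceptors(new RequestCounterInterceptor()) so every route is counted.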

Optimizing OpenTelemetry Performance in Production

  • Use Batch Exporters: Reduces the overhead of sending trace data.
  • Enable Sampling: Configure sampling to avoid excessive trace collection.
  • Use Efficient Storage Backends: Export traces and metrics to optimized observability platforms.
πŸ’‘
To explore how OpenTelemetry handles metrics alongside traces, check out this guide on OpenTelemetry Metrics.

Adding Tracing to NestJS Applications

To enable tracing in a NestJS application, you need to integrate a tracer that captures and propagates trace data. Here’s how you can do it step by step:

1. Setting Up a Tracer

First, create a dedicated tracer.ts file and configure the tracer:

import { NodeTracerProvider } from '@opentelemetry/sdk-trace-node';
import { BatchSpanProcessor } from '@opentelemetry/sdk-trace-base';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';
import { trace } from '@opentelemetry/api';

const provider = new NodeTracerProvider();
const exporter = new OTLPTraceExporter({ url: 'http://localhost:4318/v1/traces' });

provider.addSpanProcessor(new BatchSpanProcessor(exporter));
provider.register();

export const tracer = trace.getTracer('nestjs-app');

2. Creating Custom Spans in NestJS Services

Once the tracer is set up, you can manually create spans to track specific operations:

import { tracer } from './tracer';

async function fetchData() {
  const span = tracer.startSpan('fetch-data');
  try {
    // Simulate data fetching
  } finally {
    span.end();
  }
}

3. Automatically Instrumenting NestJS with OpenTelemetry

To automatically instrument NestJS modules and HTTP requests, register OpenTelemetry in your main.ts:

import './tracer';
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  await app.listen(3000);
}
bootstrap();

With these steps, your NestJS application will be able to generate and export trace data, improving observability and performance monitoring.

Key Strategies to Optimize OpenTelemetry Tracing

Tracing adds some overhead, so it's worth tuning how much data you collect and export. Here are some key strategies to optimize OpenTelemetry tracing in NestJS applications:

1. Use Sampling to Reduce Overhead

Instead of tracing every request, implement a sampling strategy to capture only a percentage of traces:

import { NodeTracerProvider } from '@opentelemetry/sdk-trace-node';
import { ParentBasedSampler, TraceIdRatioBasedSampler } from '@opentelemetry/sdk-trace-base';

const provider = new NodeTracerProvider({
  sampler: new ParentBasedSampler({
    root: new TraceIdRatioBasedSampler(0.1), // 10% sampling rate
  }),
});

2. Optimize Exporter Configuration

Batch processing reduces the number of network requests for exporting traces:

provider.addSpanProcessor(new BatchSpanProcessor(exporter, {
  maxQueueSize: 500,
  scheduledDelayMillis: 5000,
}));

3. Avoid Tracing Unnecessary Endpoints

Filter out health checks or static file requests to prevent unnecessary trace collection:

const httpInstrumentation = new HttpInstrumentation({
  ignoreIncomingRequestHook: (req) => req.url.includes('/health') || req.url.includes('/static'),
});

4. Use Efficient Observability Platforms

Export trace data to optimized backends like Last9, Grafana Tempo, Lightstep, or Honeycomb, which offer better performance and scalability.

πŸ’‘
If you're evaluating observability platforms, you might find this list of top Datadog alternatives for 2025 useful.

Exporting OpenTelemetry Data to Observability Platforms

You can send OpenTelemetry data to platforms like:

  • Jaeger: http://localhost:14268/api/traces
  • Prometheus: http://localhost:9090
  • Grafana Tempo: http://tempo:4317
  • Last9 / Datadog / Honeycomb: Configurable via API endpoints

Modify the OTLPTraceExporter URL in tracer.ts accordingly. Keep in mind that the exporter used in this guide speaks OTLP over HTTP, so the target needs to expose an OTLP endpoint (typically port 4318 with the /v1/traces path) rather than a backend's proprietary ingestion API.
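
For example, if you run Grafana Tempo locally (the tempo hostname assumes a Docker Compose service with that name), the OTLP/HTTP exporter would point at port 4318 rather than the gRPC port 4317 listed above:

import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';

// OTLP over HTTP conventionally listens on 4318 and expects the /v1/traces path
const exporter = new OTLPTraceExporter({
  url: 'http://tempo:4318/v1/traces',
});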

How to Export OpenTelemetry Data to Last9

Last9 accepts OpenTelemetry data over OTLP, so exporting your traces takes only a few steps:

Step 1: Install Dependencies

Ensure you have the necessary OpenTelemetry libraries installed. For a Node.js application, run:

npm install @opentelemetry/sdk-node @opentelemetry/exporter-trace-otlp-http @opentelemetry/auto-instrumentations-node

For a Python application, use:

pip install opentelemetry-sdk opentelemetry-exporter-otlp

Step 2: Obtain Last9 OTLP Endpoint and Credentials

  1. Create a Last9 Cluster: Follow the Getting Started guide to set up your cluster.
  2. Retrieve Endpoint and Credentials: After setting up, note the following:
    • Remote Write URL: $levitate_remote_write_url
    • Cluster ID: $levitate_remote_write_username
    • Write Token: $levitate_remote_write_password

Step 3: Configure OpenTelemetry in Your Application

Modify your tracing configuration to export traces to Last9.

For Node.js (tracing.js):

const { NodeSDK } = require('@opentelemetry/sdk-node');
const { OTLPTraceExporter } = require('@opentelemetry/exporter-trace-otlp-http');
const { getNodeAutoInstrumentations } = require('@opentelemetry/auto-instrumentations-node');

const traceExporter = new OTLPTraceExporter({
  url: 'https://ingest.last9.io/v1/traces', // Replace with your Last9 endpoint
  headers: {
    'Authorization': 'Bearer YOUR_LAST9_WRITE_TOKEN', // Replace with your write token
  },
});

const sdk = new NodeSDK({
  traceExporter,
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();
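
Optionally, flush any buffered spans when the process stops so traces aren't lost on deploys. A small addition to the same tracing.js, using NodeSDK's shutdown method:

// Flush pending spans and stop the SDK cleanly on termination
process.on('SIGTERM', () => {
  sdk.shutdown()
    .then(() => console.log('OpenTelemetry SDK shut down'))
    .catch((err) => console.error('Error shutting down OpenTelemetry SDK', err))
    .finally(() => process.exit(0));
});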

For Python:

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

trace.set_tracer_provider(TracerProvider())
tracer = trace.get_tracer(__name__)

otlp_exporter = OTLPSpanExporter(
    endpoint="https://ingest.last9.io/v1/traces",  # Replace with your Last9 endpoint
    headers={"Authorization": "Bearer YOUR_LAST9_WRITE_TOKEN"},  # Replace with your write token
)

span_processor = BatchSpanProcessor(otlp_exporter)
trace.get_tracer_provider().add_span_processor(span_processor)

Step 4: Start Collecting and Sending Traces

With the configuration in place, your application will automatically collect traces and send them to Last9.
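
If you keep the setup in a standalone tracing.js file, one way to make sure it loads before any application code is Node's --require flag (the paths here follow the examples above):

node --require ./tracing.js dist/main.js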

Step 5: Verify Data in Last9

  1. Log in to Last9: Access your Last9 dashboard.
  2. Navigate to Traces: Go to the Traces section to view incoming data.

Step 6: Optimize with Last9's Insights

Use Last9's features to:

  • Set Up Alerts: Monitor anomalies in your system.
  • Analyze Performance: Identify and resolve latency issues.
  • Correlate Data: Link traces with logs and metrics for comprehensive insights.

For more detailed information, refer to Last9's Traces Documentation.

Conclusion

Setting up OpenTelemetry tracing with Last9 streamlines observability, offering real-time insights into system performance.

With proper configuration, you can track latency, detect anomalies, and correlate data across logs and metrics.

Last9’s analytics provide a scalable way to optimize infrastructure and enhance monitoring. Start integrating today for deeper visibility into your applications.

πŸ’‘
And if you have any questions, join our Discord community! There's a dedicated channel where you can connect with other developers and discuss your specific use case.

FAQs

1. What benefits does OpenTelemetry provide for NestJS applications?

It offers distributed tracing, custom metrics collection, and improved debugging capabilities, essential for maintaining scalable NestJS applications.

2. How can I view OpenTelemetry traces from my NestJS app?

You can visualize traces using Jaeger, Zipkin, Grafana Tempo, or any OpenTelemetry-compatible backend.

3. Can OpenTelemetry be used with NestJS microservices?

Yes, OpenTelemetry supports NestJS microservices by instrumenting gRPC, Kafka, RabbitMQ, and Redis-based communication.

4. What impact does OpenTelemetry have on application performance?

While OpenTelemetry introduces minimal overhead, using optimizations like batch processing and sampling can mitigate performance concerns.


Authors

Aditya Godbole
CTO at Last9