Version: JS SDK v4

TypeScript SDK - Overview

The modular Langfuse TypeScript SDK (v4) is built on OpenTelemetry for robust observability, better context management, and easy integration with third-party libraries.

This documentation covers the TypeScript SDK v4. If you are looking for the TypeScript SDK v3, refer to the v3 documentation.

If you are self-hosting Langfuse, the TypeScript SDK v4 requires Langfuse platform version ≥ 3.95.0 for all features to work correctly.

Quickstart

Get your first trace into Langfuse in just a few minutes.

Install packages

Install the relevant packages to get started with tracing:

npm install @langfuse/tracing @langfuse/otel @opentelemetry/sdk-node

Learn more about the individual packages in the Packages section below.

Set up environment variables

Add your Langfuse credentials to your environment variables. Make sure that you have a .env file in your project root and a package like dotenv to load the variables.

.env
LANGFUSE_SECRET_KEY = "sk-lf-..."
LANGFUSE_PUBLIC_KEY = "pk-lf-..."
LANGFUSE_BASE_URL = "https://cloud.langfuse.com" # 🇪🇺 EU region
# LANGFUSE_BASE_URL = "https://us.cloud.langfuse.com" # 🇺🇸 US region
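
If your runtime does not load .env files automatically, load them before the tracing setup runs. A minimal sketch using the dotenv package (an assumption; any environment loader works), placed at the very top of the instrumentation file created in the next step:

instrumentation.ts
import "dotenv/config"; // loads .env into process.env before the LangfuseSpanProcessor reads the credentials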

Set up OpenTelemetry

Create a file named instrumentation.ts to initialize the OpenTelemetry SDK. The LangfuseSpanProcessor is the key component that sends traces to Langfuse.

instrumentation.ts
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";
 
const sdk = new NodeSDK({
  spanProcessors: [new LangfuseSpanProcessor()],
});
 
sdk.start();

Import the instrumentation.ts file at the very top of your application’s entry point (e.g., index.ts).

index.ts
import "./instrumentation"; // Must be the first import

Learn more about setting up OpenTelemetry in the setup guide.
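
If you prefer not to rely on environment variables, the credentials can also be passed to the span processor directly. A minimal sketch, assuming the LangfuseSpanProcessor constructor accepts publicKey, secretKey, and baseUrl options (verify against your installed version):

instrumentation.ts
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";

const sdk = new NodeSDK({
  spanProcessors: [
    new LangfuseSpanProcessor({
      publicKey: "pk-lf-...", // assumed option names
      secretKey: "sk-lf-...",
      baseUrl: "https://cloud.langfuse.com",
    }),
  ],
});

sdk.start();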

Instrument your application

Use one of the native Langfuse framework integrations to automatically trace your application.

Alternatively, instrument your application manually, for example with the startActiveObservation function. It takes a callback and automatically manages the observation’s lifecycle and the OpenTelemetry context: any observation created inside the callback is automatically nested under the active observation, and the observation is ended when the callback finishes.

This is just an example; see the instrumentation page for more details.

index.ts
import "./instrumentation";
import { startActiveObservation } from "@langfuse/tracing";
 
async function main() {
  await startActiveObservation("my-first-trace", async (span) => {
    span.update({
      input: "Hello, Langfuse!",
      output: "This is my first trace!",
    });
  });
}
 
main();
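
To see the automatic nesting in action, start another observation inside the callback; it becomes a child of the active observation without any manual context wiring. A short sketch using the same startActiveObservation helper (the observation names are illustrative):

index.ts
import "./instrumentation";
import { startActiveObservation } from "@langfuse/tracing";

async function main() {
  await startActiveObservation("handle-request", async (span) => {
    span.update({ input: "Hello, Langfuse!" });

    // Created while "handle-request" is active, so it is nested under it automatically
    await startActiveObservation("fetch-context", async (child) => {
      child.update({ output: "some retrieved context" });
    });

    span.update({ output: "This is my first trace!" });
  });
}

main();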

Run your application

Execute your application. You should see your trace appear in the Langfuse UI.

npx tsx index.ts
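
In short-lived scripts, the process can exit before buffered spans are exported. A minimal sketch that flushes them by shutting down the OpenTelemetry SDK once the script finishes (this assumes you export the sdk instance from instrumentation.ts):

instrumentation.ts
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";

// Exported so the entry point can flush and shut it down
export const sdk = new NodeSDK({
  spanProcessors: [new LangfuseSpanProcessor()],
});

sdk.start();

index.ts
import { sdk } from "./instrumentation"; // must still be the first import
import { startActiveObservation } from "@langfuse/tracing";

async function main() {
  await startActiveObservation("my-first-trace", async (span) => {
    span.update({ input: "Hello, Langfuse!", output: "This is my first trace!" });
  });
}

main()
  .then(() => sdk.shutdown()) // flushes any buffered spans before the process exits
  .catch(console.error);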

Packages

The Langfuse SDK is designed to be modular. Here’s an overview of the available packages:

| Package | Description | Environment |
| --- | --- | --- |
| @langfuse/core | Core utilities, types, and logger shared across packages. | Universal JS |
| @langfuse/client | Client for features like prompts, datasets, and scores. | Universal JS |
| @langfuse/tracing | Core OpenTelemetry-based tracing functions (startObservation, etc.). | Universal JS |
| @langfuse/otel | The LangfuseSpanProcessor to export traces to Langfuse. | Node.js ≥ 20 |
| @langfuse/openai | Automatic tracing integration for the OpenAI SDK. | Universal JS |
| @langfuse/langchain | CallbackHandler for tracing LangChain applications. | Universal JS |
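
For example, the OpenAI package wraps the OpenAI client so that completion calls are traced automatically. A minimal sketch, assuming the package exposes an observeOpenAI wrapper as in earlier SDK versions (check the integration docs for the exact API):

index.ts
import "./instrumentation";
import OpenAI from "openai";
import { observeOpenAI } from "@langfuse/openai"; // assumed export name

// Wrapping the client is enough; subsequent calls are traced automatically
const openai = observeOpenAI(new OpenAI());

async function main() {
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini", // illustrative model name
    messages: [{ role: "user", content: "Hello!" }],
  });
  console.log(completion.choices[0].message.content);
}

main();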

OpenTelemetry foundation

Building on OpenTelemetry is a core design choice for this SDK. It offers several key advantages:

  • Standardization: It aligns with the industry standard for observability, making it easier to integrate with existing monitoring and APM tools.
  • Robust Context Management: OpenTelemetry provides reliable context propagation, ensuring that traces are correctly linked even in complex, asynchronous applications.
  • Ecosystem & Interoperability: You can leverage a vast ecosystem of third-party instrumentations. If a library you use supports OpenTelemetry, its traces can be sent to Langfuse automatically.
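
For example, existing OpenTelemetry instrumentations can be registered next to the LangfuseSpanProcessor, and their spans appear in Langfuse as part of the same traces. A sketch assuming the @opentelemetry/auto-instrumentations-node package is installed:

instrumentation.ts
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";

const sdk = new NodeSDK({
  spanProcessors: [new LangfuseSpanProcessor()],
  // Auto-instrument common libraries (http, pg, ioredis, ...) so their spans are captured too
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();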

