
Trace Anthropic JS/TS with Langfuse


Anthropic provides advanced language models like Claude, known for their safety, helpfulness, and strong reasoning capabilities. By combining Anthropic’s JS/TS SDK with Langfuse, you can trace, monitor, and analyze your AI workloads in development and production.

This notebook demonstrates how to use the AnthropicInstrumentation class from OpenInference to automatically instrument Anthropic SDK calls and export the resulting OpenTelemetry spans to Langfuse.

What is Anthropic?
Anthropic is an AI safety company that develops Claude, a family of large language models designed to be helpful, harmless, and honest. Claude models excel at complex reasoning, analysis, and creative tasks.

What is Langfuse?
Langfuse is an open source platform for LLM observability and monitoring. It helps you trace and monitor your AI applications by capturing metadata, prompt details, token usage, latency, and more.

Step 1: Install Dependencies

Install the necessary packages:

npm install @anthropic-ai/sdk @arizeai/openinference-instrumentation-anthropic @langfuse/otel @opentelemetry/sdk-node

Note: This cookbook runs on Deno, which uses npm: specifiers for imports and Deno.env.set() for environment variables. For Node.js applications the setup is otherwise the same: install the packages with npm and read configuration from process.env, as sketched below.
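
For reference, here is a minimal sketch of the Node.js equivalent (an illustration, not part of the original cookbook): the same packages from Step 1 are imported directly, and configuration is read from process.env instead of being set with Deno.env.set().

// Node.js sketch: set the variables in your shell or a .env file first, e.g.
//   export ANTHROPIC_API_KEY="sk-ant-..."
//   export LANGFUSE_PUBLIC_KEY="pk-lf-..."
//   export LANGFUSE_SECRET_KEY="sk-lf-..."
//   export LANGFUSE_BASE_URL="https://cloud.langfuse.com"
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";
import { AnthropicInstrumentation } from "@arizeai/openinference-instrumentation-anthropic";
import Anthropic from "@anthropic-ai/sdk";
 
// Both SDKs pick up their configuration from process.env automatically.
console.log(process.env.LANGFUSE_BASE_URL);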

Step 2: Configure Environment

Set up your Langfuse and Anthropic API keys. You can get Langfuse keys by signing up for a free Langfuse Cloud account or by self-hosting Langfuse. Get your Anthropic API key from the Anthropic Console.

// Set environment variables using Deno-specific syntax
Deno.env.set("ANTHROPIC_API_KEY", "sk-ant-...");
 
// Langfuse authentication keys
Deno.env.set("LANGFUSE_PUBLIC_KEY", "pk-lf-...");
Deno.env.set("LANGFUSE_SECRET_KEY", "sk-lf-...");
 
// Langfuse host configuration
Deno.env.set("LANGFUSE_BASE_URL", "https://cloud.langfuse.com"); // 🇪🇺 EU region
// Deno.env.set("LANGFUSE_BASE_URL", "https://us.cloud.langfuse.com"); // 🇺🇸 US region

Step 3: Initialize OpenTelemetry with Langfuse

Set up the OpenTelemetry SDK with the LangfuseSpanProcessor and the AnthropicInstrumentation from OpenInference. The instrumentation automatically captures Anthropic SDK calls and sends them as OpenTelemetry spans to Langfuse.

import { NodeSDK } from "npm:@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "npm:@langfuse/otel";
import { AnthropicInstrumentation } from "npm:@arizeai/openinference-instrumentation-anthropic";
 
import Anthropic from "npm:@anthropic-ai/sdk";
 
// Configure the instrumentation for the Anthropic SDK.
// manuallyInstrument() patches the imported module directly, which is needed
// here because ESM-style (npm:) imports are not patched automatically.
const instrumentation = new AnthropicInstrumentation();
instrumentation.manuallyInstrument(Anthropic);
 
// Initialize the OpenTelemetry SDK with Langfuse as the span processor
const sdk = new NodeSDK({
  spanProcessors: [new LangfuseSpanProcessor()],
  instrumentations: [instrumentation],
});
 
sdk.start();
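
In a short-lived script, the sdk.shutdown() call in the next step flushes any buffered spans before the process exits. In a long-running application you may want to flush without shutting down. A minimal sketch of a variant of the setup above, which keeps a reference to the processor and uses the standard OpenTelemetry SpanProcessor forceFlush() method:

// Keep a reference to the span processor so it can be flushed on demand.
const langfuseSpanProcessor = new LangfuseSpanProcessor();
 
const sdk = new NodeSDK({
  spanProcessors: [langfuseSpanProcessor],
  instrumentations: [instrumentation],
});
sdk.start();
 
// Later, e.g. on a timer or at the end of a request handler:
await langfuseSpanProcessor.forceFlush();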

Step 4: Use the Anthropic SDK

Now use the Anthropic SDK as you normally would. All calls are automatically traced and sent to Langfuse.

const anthropic = new Anthropic();
 
const message = await anthropic.messages.create({
  model: "claude-haiku-4-5",
  max_tokens: 1000,
  messages: [{ role: "user", content: "Hello, Claude!" }],
});
 
console.log(message.content);
 
await sdk.shutdown();
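
Streaming works the same way. The sketch below (run it before sdk.shutdown()) uses the SDK's messages.stream() helper; whether and how streamed outputs appear in the trace depends on the instrumentation's streaming support, so treat this as an illustration:

// Stream a response; the helper emits text deltas as they arrive.
const stream = anthropic.messages.stream({
  model: "claude-haiku-4-5",
  max_tokens: 1000,
  messages: [{ role: "user", content: "Write a haiku about tracing." }],
});
 
stream.on("text", (text) => {
  console.log(text);
});
 
// Await the final message so the call completes before shutting down.
const finalMessage = await stream.finalMessage();
console.log(finalMessage.content);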

View Traces in Langfuse

After running the application, navigate to your Langfuse Trace Table. You will find detailed traces of the application’s execution, providing insights into the LLM calls, inputs, outputs, and performance metrics.

Example trace in the Langfuse UI
