TypeScript SDK - Upgrade Path v3 to v4
Please follow each section below to upgrade your application from v3 to v4.
If you encounter any questions or issues while upgrading, please raise an issue on GitHub.
Tracing
Tracing in the v4 SDK is a major rewrite based on OpenTelemetry and introduces several breaking changes.
- OTEL-based Architecture: The SDK is now built on top of OpenTelemetry. An OpenTelemetry setup is now required and is done by registering the `LangfuseSpanProcessor` with an OpenTelemetry `NodeSDK` (see the sketch after this list).
- New Tracing Functions: The `langfuse.trace()`, `langfuse.span()`, and `langfuse.generation()` methods have been replaced by `startObservation`, `startActiveObservation`, etc., from the `@langfuse/tracing` package.
- Separation of Concerns:
  - The `@langfuse/tracing` and `@langfuse/otel` packages are for tracing.
  - The `@langfuse/client` package and the `LangfuseClient` class are now only for non-tracing features like scoring, prompt management, and datasets.
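As a rough sketch of the new setup (the `NodeSDK` option name and the observation attribute shapes shown here are assumptions; check the SDK v4 docs for the exact API):

```ts
// Instrumentation: register the Langfuse span processor with OpenTelemetry.
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";

const sdk = new NodeSDK({
  spanProcessors: [new LangfuseSpanProcessor()],
});
sdk.start();
```

```ts
// Application code: create observations with the new tracing functions.
import { startObservation } from "@langfuse/tracing";

const span = startObservation("fetch-user-data", { input: { userId: "123" } });
// ... do the work ...
span.update({ output: { ok: true } });
span.end();
```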
See the SDK v4 docs for details on each.
Prompt Management
- Import: The Langfuse client is now imported as:

  ```ts
  import { LangfuseClient } from "@langfuse/client";
  ```

- Usage: The Langfuse client is now used as:

  ```ts
  const langfuse = new LangfuseClient();

  const prompt = await langfuse.prompt.get("my-prompt");
  const compiledPrompt = prompt.compile({ topic: "developers" });

  const response = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [{ role: "user", content: compiledPrompt }],
  });
  ```

- `version` is now an optional property of the options object of `langfuse.prompt.get()` instead of a positional argument:

  ```ts
  const prompt = await langfuse.prompt.get("my-prompt", { version: "1.0" });
  ```
OpenAI integration
- Import: The OpenAI integration is now imported as:

  ```ts
  import { observeOpenAI } from "@langfuse/openai";
  ```

- You can now set the `environment` and `release` via the `LANGFUSE_TRACING_ENVIRONMENT` and `LANGFUSE_TRACING_RELEASE` environment variables. A short usage sketch follows below.
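A minimal usage sketch, assuming the wrapper is applied to the OpenAI client as in v3 (the model name and message are placeholders):

```ts
import OpenAI from "openai";
import { observeOpenAI } from "@langfuse/openai";

// Wrap the OpenAI client; subsequent calls are traced automatically.
const openai = observeOpenAI(new OpenAI());

const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
});
```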
Vercel AI SDK
Works very similarly to v3, but replaces `LangfuseExporter` from `langfuse-vercel` with the regular `LangfuseSpanProcessor` from `@langfuse/otel`.
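As an illustrative sketch, assuming the `LangfuseSpanProcessor` is registered with an OpenTelemetry `NodeSDK` as in the Tracing section (model and prompt are placeholders):

```ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// With the LangfuseSpanProcessor registered, enable the AI SDK's
// OpenTelemetry output so its spans are picked up by Langfuse.
const { text } = await generateText({
  model: openai("gpt-4o"),
  prompt: "Tell me a joke",
  experimental_telemetry: { isEnabled: true },
});
```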
Please see the full example on usage with the AI SDK for more details.
Langchain integration
- Import: The Langchain integration is now imported as:

  ```ts
  import { CallbackHandler } from "@langfuse/langchain";
  ```

- You can now set the `environment` and `release` via the `LANGFUSE_TRACING_ENVIRONMENT` and `LANGFUSE_TRACING_RELEASE` environment variables. A usage sketch follows below.
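A minimal usage sketch, assuming the handler is passed via LangChain's standard `callbacks` option (the chain itself is illustrative):

```ts
import { CallbackHandler } from "@langfuse/langchain";
import { ChatOpenAI } from "@langchain/openai";
import { PromptTemplate } from "@langchain/core/prompts";

const handler = new CallbackHandler();

const chain = PromptTemplate.fromTemplate("Tell me a joke about {topic}").pipe(
  new ChatOpenAI({ model: "gpt-4o" })
);

// Pass the Langfuse callback handler at invocation time to trace this run.
const result = await chain.invoke(
  { topic: "developers" },
  { callbacks: [handler] }
);
```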
langfuseClient.getTraceUrl
- The method is now asynchronous and returns a promise:

  ```ts
  const traceUrl = await langfuseClient.getTraceUrl(traceId);
  ```
Scoring
- Import: The Langfuse client is now imported as:

  ```ts
  import { LangfuseClient } from "@langfuse/client";
  ```

- Usage: Scores are now created via `langfuse.score.create()`:

  ```ts
  const langfuse = new LangfuseClient();

  await langfuse.score.create({
    traceId: "trace_id_here",
    name: "accuracy",
    value: 0.9,
  });
  ```
See custom scores documentation for new scoring methods.
Datasets
See datasets documentation for new dataset methods.