TypeScript SDK - Upgrade Path v3 to v4
Please follow each section below to upgrade your application from v3 to v4.
If you have questions or run into issues while upgrading, please raise an issue on GitHub.
Initialization
The Langfuse base URL environment variable is now `LANGFUSE_BASE_URL` (previously `LANGFUSE_BASEURL`). For backward compatibility, the old name still works in v4, but it will be removed in a future version.
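For example, in a shell environment the renamed variable can be set like this (the URL shown is the Langfuse Cloud default; substitute your own host if self-hosting):

```shell
# v4 variable name
export LANGFUSE_BASE_URL="https://cloud.langfuse.com"

# v3 name, still honored by v4 for backward compatibility only
# export LANGFUSE_BASEURL="https://cloud.langfuse.com"
```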
Tracing
The v4 SDK tracing is a major rewrite based on OpenTelemetry and introduces several breaking changes.
- OTEL-based Architecture: The SDK is now built on top of OpenTelemetry. An OpenTelemetry setup is now required and is done by registering the `LangfuseSpanProcessor` with an OpenTelemetry `NodeSDK`.
- New Tracing Functions: The `langfuse.trace()`, `langfuse.span()`, and `langfuse.generation()` methods have been replaced by `startObservation`, `startActiveObservation`, etc., from the `@langfuse/tracing` package.
- Separation of Concerns:
  - The `@langfuse/tracing` and `@langfuse/otel` packages are for tracing.
  - The `@langfuse/client` package and the `LangfuseClient` class are now only for non-tracing features like scoring, prompt management, and datasets.
See the SDK v4 docs for details on each.
Prompt Management
- Import: The import of the Langfuse client is now:

  ```ts
  import { LangfuseClient } from "@langfuse/client";
  ```

- Usage: The usage of the Langfuse client is now:

  ```ts
  const langfuse = new LangfuseClient();

  const prompt = await langfuse.prompt.get("my-prompt");
  const compiledPrompt = prompt.compile({ topic: "developers" });

  const response = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [{ role: "user", content: compiledPrompt }],
  });
  ```

- `version` is now an optional property of the options object of `langfuse.prompt.get()` instead of a positional argument:

  ```ts
  const prompt = await langfuse.prompt.get("my-prompt", { version: "1.0" });
  ```
OpenAI integration
- Import: The import of the OpenAI integration is now:

  ```ts
  import { observeOpenAI } from "@langfuse/openai";
  ```

- You can now set the `environment` and `release` via the `LANGFUSE_TRACING_ENVIRONMENT` and `LANGFUSE_TRACING_RELEASE` environment variables.
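For example, in a shell environment (the values shown are placeholders):

```shell
# Placeholder values; use your own environment and release identifiers.
export LANGFUSE_TRACING_ENVIRONMENT="production"
export LANGFUSE_TRACING_RELEASE="1.2.3"
```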
Vercel AI SDK
Works very similarly to v3, but replaces the `LangfuseExporter` from `langfuse-vercel` with the regular `LangfuseSpanProcessor` from `@langfuse/otel`.
Please see the full example on usage with the AI SDK for more details.
Langchain integration
- Import: The import of the Langchain integration is now:

  ```ts
  import { CallbackHandler } from "@langfuse/langchain";
  ```

- You can now set the `environment` and `release` via the `LANGFUSE_TRACING_ENVIRONMENT` and `LANGFUSE_TRACING_RELEASE` environment variables.
langfuseClient.getTraceUrl
- The method is now asynchronous and returns a promise:

  ```ts
  const traceUrl = await langfuseClient.getTraceUrl(traceId);
  ```
Scoring
- Import: The import of the Langfuse client is now:

  ```ts
  import { LangfuseClient } from "@langfuse/client";
  ```

- Usage: The usage of the Langfuse client is now:

  ```ts
  const langfuse = new LangfuseClient();

  await langfuse.score.create({
    traceId: "trace_id_here",
    name: "accuracy",
    value: 0.9,
  });
  ```
See the custom scores documentation for new scoring methods.
Datasets
See the datasets documentation for new dataset methods.