Langfuse SDKs
The Langfuse SDKs are the recommended way to integrate with Langfuse for custom instrumentation.
Properties:
- Based on OpenTelemetry, so you can use any OTEL-based instrumentation library for your LLM.
- Fully asynchronous requests, so using Langfuse adds almost no latency
- Accurate latency tracking using synchronous timestamps
- IDs available for downstream use
- Great DX when nesting observations (see the sketch after this list)
- Cannot break your application; all errors are caught and logged
- Interoperable with Langfuse integrations
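To make these properties concrete, here is a minimal, hypothetical sketch of nested custom instrumentation using the TypeScript SDK's startObservation (from @langfuse/tracing, listed in the JS/TS package table below). The option shapes, the traceId property, and the child-creation method shown here are assumptions; check the SDK documentation for the exact API.

```typescript
import { startObservation } from "@langfuse/tracing";

// NOTE: option shapes and method names below are illustrative assumptions;
// see the TypeScript SDK docs for the exact API.
const span = startObservation("handle-user-request", {
  input: { question: "What is Langfuse?" },
});

// IDs are generated synchronously, so they can be logged or passed
// downstream right away (e.g. to attach user feedback to the trace later).
console.log("traceId:", span.traceId);

// Nested child observation for the LLM call.
const generation = span.startObservation("llm-call", {
  input: "What is Langfuse?",
});
generation.update({ output: "Langfuse is an open-source LLM engineering platform." });
generation.end();

// Ending an observation records its timestamp synchronously; the actual export
// to Langfuse happens asynchronously in the background and won't block the app.
span.end();
```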
Python
Please see our Python SDK documentation on how to get started.
JS/TS
The Langfuse SDK is designed to be modular. Here’s an overview of the available packages:
| Package | Description | Environment |
| --- | --- | --- |
| @langfuse/core | Core utilities, types, and logger shared across packages. | Universal JS |
| @langfuse/client | Client for features like prompts, datasets, and scores. | Universal JS |
| @langfuse/tracing | Core OpenTelemetry-based tracing functions (`startObservation`, etc.). | Universal JS |
| @langfuse/otel | The `LangfuseSpanProcessor` to export traces to Langfuse. | Node.js ≥ 20 |
| @langfuse/openai | Automatic tracing integration for the OpenAI SDK. | Universal JS |
| @langfuse/langchain | `CallbackHandler` for tracing LangChain applications. | Universal JS |
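For orientation, here is a minimal sketch of how these packages compose in a Node.js application. It assumes that `LangfuseSpanProcessor` picks up credentials from the standard LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, and LANGFUSE_BASE_URL environment variables; option names are illustrative, not a definitive setup.

```typescript
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";
import { startObservation } from "@langfuse/tracing";

// Register the Langfuse span processor with a standard OpenTelemetry NodeSDK.
// Credentials are assumed to come from LANGFUSE_* environment variables.
const sdk = new NodeSDK({
  spanProcessors: [new LangfuseSpanProcessor()],
});
sdk.start();

// With the processor registered, observations created via @langfuse/tracing
// (or any other OTel instrumentation) are exported to Langfuse.
const span = startObservation("hello-langfuse");
span.end();

// Flush pending spans before the process exits.
await sdk.shutdown();
```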
Please see our TypeScript SDK documentation on how to get started.
Other Languages
Via the public API, you can integrate with Langfuse from any language.
For tracing, use the OpenTelemetry SDK of your choice and send traces to the Langfuse OTel Endpoint.
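The setup looks the same in every OpenTelemetry SDK: configure an OTLP exporter that points at the Langfuse OTel endpoint and authenticates with your API keys. The sketch below uses TypeScript purely to illustrate the pattern; the endpoint path and the Basic-auth header format are assumptions to verify against the Langfuse API reference.

```typescript
import { NodeSDK } from "@opentelemetry/sdk-node";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

// Assumed endpoint and auth scheme; check the Langfuse API reference for the exact values.
const exporter = new OTLPTraceExporter({
  url: "https://cloud.langfuse.com/api/public/otel/v1/traces",
  headers: {
    Authorization:
      "Basic " +
      Buffer.from(
        `${process.env.LANGFUSE_PUBLIC_KEY}:${process.env.LANGFUSE_SECRET_KEY}`
      ).toString("base64"),
  },
});

const sdk = new NodeSDK({ traceExporter: exporter });
sdk.start();
```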