LiveKit Agents Tracing Integration
This guide shows you how to integrate Langfuse with LiveKit Agents for observability and tracing of real-time voice AI applications. By following these steps, you’ll be able to monitor, debug and evaluate your LiveKit agents in the Langfuse dashboard.
LiveKit Agents (repo) is an open-source Python and Node.js framework for building production-grade multimodal and voice AI agents. It provides a complete set of tools and abstractions for feeding realtime media through AI pipelines, supporting both high-performance STT-LLM-TTS voice pipelines and speech-to-speech models with any AI provider.
Example of a LiveKit agent conversation with telemetry in Langfuse.
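For orientation, a minimal STT-LLM-TTS voice agent looks roughly like the sketch below. This is a hedged example, not part of the integration itself: it assumes the Silero, Deepgram, and OpenAI plugins are installed, and the providers and model name are interchangeable placeholders.

from livekit import agents
from livekit.agents import Agent, AgentSession
from livekit.plugins import deepgram, openai, silero


async def entrypoint(ctx: agents.JobContext):
    # STT-LLM-TTS pipeline; any supported provider can be swapped in at each stage
    session = AgentSession(
        vad=silero.VAD.load(),
        stt=deepgram.STT(),
        llm=openai.LLM(model="gpt-4o-mini"),
        tts=openai.TTS(),
    )
    await session.start(room=ctx.room, agent=Agent(instructions="You are a helpful voice assistant."))
    await ctx.connect()


if __name__ == "__main__":
    agents.cli.run_app(agents.WorkerOptions(entrypoint_fnc=entrypoint))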
Features
A trace represents the execution flow of a single request within an LLM application. It captures all relevant steps, including duration and metadata.
Agent telemetry records traces for the following activities:
- Session start
- Agent turn
- LLM node
- Function tool (see the example below)
- TTS node
- End-of-turn detection
- LLM and TTS metrics
Learn more about LiveKit’s built-in telemetry in the LiveKit documentation.
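For example, the "Function tool" spans correspond to tools you define on your agent, which are traced each time the LLM invokes them. A minimal sketch, assuming the function_tool decorator and RunContext from livekit.agents; the tool name and return value are illustrative only.

from livekit.agents import Agent, RunContext, function_tool


class Assistant(Agent):
    def __init__(self) -> None:
        super().__init__(instructions="You are a helpful voice assistant.")

    @function_tool
    async def lookup_weather(self, context: RunContext, location: str) -> str:
        """Look up the current weather for a location."""
        # Replace with a real lookup; each invocation is recorded as a function tool span.
        return f"The weather in {location} is sunny."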
End-to-end example
We’ve created an end-to-end example of how to trace a LiveKit agent with Langfuse. You can find the code on GitHub.
Get Started
LiveKit Agents includes built-in OpenTelemetry support, and Langfuse provides an OpenTelemetry endpoint. Follow these steps to enable comprehensive tracing for your LiveKit application.
Obtain Langfuse API keys
Create a project in Langfuse Cloud or self-host Langfuse and copy your API keys.
Environment Configuration
LANGFUSE_SECRET_KEY="sk-lf-..."
LANGFUSE_PUBLIC_KEY="pk-lf-..."
LANGFUSE_HOST="https://cloud.langfuse.com" # 🇪🇺 EU region
# LANGFUSE_HOST="https://us.cloud.langfuse.com" # 🇺🇸 US region
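If you keep these values in a local .env file, one option (not required by the integration) is to load them with python-dotenv before configuring telemetry:

from dotenv import load_dotenv

load_dotenv()  # reads the LANGFUSE_* variables from .env into os.environ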
Enable telemetry
To enable telemetry, configure a tracer provider using set_tracer_provider in your entrypoint function.
Set the required public key, secret key, and host as environment variables.
import base64
import os

from livekit.agents import JobContext
from livekit.agents.telemetry import set_tracer_provider


def setup_langfuse(
    host: str | None = None, public_key: str | None = None, secret_key: str | None = None
):
    from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import BatchSpanProcessor

    public_key = public_key or os.getenv("LANGFUSE_PUBLIC_KEY")
    secret_key = secret_key or os.getenv("LANGFUSE_SECRET_KEY")
    host = host or os.getenv("LANGFUSE_HOST")

    if not public_key or not secret_key or not host:
        raise ValueError("LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, and LANGFUSE_HOST must be set")

    # Point the OTLP exporter at Langfuse's OpenTelemetry endpoint with Basic auth
    langfuse_auth = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
    os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = f"{host.rstrip('/')}/api/public/otel"
    os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = f"Authorization=Basic {langfuse_auth}"

    # Register the provider with LiveKit Agents so its telemetry is exported to Langfuse
    trace_provider = TracerProvider()
    trace_provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
    set_tracer_provider(trace_provider)


async def entrypoint(ctx: JobContext):
    setup_langfuse()  # set up the Langfuse tracer provider

    # ...
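Spans are exported in the background by the BatchSpanProcessor. If you want pending spans flushed when a job ends, one option is to keep a reference to the provider, for example by having setup_langfuse return trace_provider (an assumption, not part of the snippet above), and register a shutdown callback:

async def entrypoint(ctx: JobContext):
    trace_provider = setup_langfuse()  # assumes setup_langfuse() is changed to end with `return trace_provider`

    async def flush_traces():
        trace_provider.force_flush()  # drain spans still queued in the BatchSpanProcessor

    ctx.add_shutdown_callback(flush_traces)

    # ...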