Observability & Tracing

Langfuse is a platform for observability and tracing of LLM applications and agents. It captures everything that happens during an LLM interaction: inputs, outputs, tool usage, retries, latencies, and costs.

Langfuse is built on OpenTelemetry, so it integrates easily with any OpenTelemetry-based instrumentation library.
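As a minimal sketch of what this OpenTelemetry foundation means in practice, the snippet below points a standard OTLP/HTTP span exporter at a Langfuse project. The endpoint path and the Basic-auth scheme built from the public/secret key pair are assumptions here; check the Langfuse OpenTelemetry docs for the exact values for your deployment.

import base64
import os

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

LANGFUSE_HOST = os.environ.get("LANGFUSE_HOST", "https://cloud.langfuse.com")

# Assumption: Langfuse accepts OTLP traces authenticated with Basic auth
# derived from the project's public and secret keys.
auth = base64.b64encode(
    f"{os.environ['LANGFUSE_PUBLIC_KEY']}:{os.environ['LANGFUSE_SECRET_KEY']}".encode()
).decode()

exporter = OTLPSpanExporter(
    endpoint=f"{LANGFUSE_HOST}/api/public/otel/v1/traces",  # assumed OTLP path
    headers={"Authorization": f"Basic {auth}"},
)

# Standard OpenTelemetry setup: spans are batched and exported to Langfuse.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("my-llm-app")

with tracer.start_as_current_span("llm-call") as span:
    span.set_attribute("model", "gpt-4o")  # span attributes appear in the Langfuse UI
    # ... call your LLM here ...

Because this is plain OpenTelemetry, any existing OTel-instrumented library can send its spans to Langfuse through the same exporter.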

Tracing Overview
Watch this walkthrough of Langfuse Observability and how to integrate it with your application.

Why Tracing?

  • Faster development & debugging: understand how the LLM/agent behaves via error analysis
  • Cost tracking for each model call
  • Reduce application latency
  • Set the foundation for evaluations
  • Save hours in customer support

Why Trace with Langfuse?

  • Open Source: Langfuse is the fastest-growing open-source LLM observability platform.
  • Multi-Modal: Langfuse supports text, images, audio, and more.
  • Multi-Model: Langfuse supports all major LLM providers.
  • Framework Agnostic: Langfuse supports LangChain, OpenAI, LlamaIndex, and more.
  • Language Agnostic: Langfuse supports Python, JavaScript, and more.

Where to Start

Follow the quickstart to add tracing to your LLM app.
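As a rough sketch of the quickstart flow, the example below assumes the Langfuse Python SDK's @observe decorator and its OpenAI drop-in integration (the exact import paths vary between SDK versions), with LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, LANGFUSE_HOST, and OPENAI_API_KEY set as environment variables.

from langfuse import observe          # @observe traces the decorated function
from langfuse.openai import openai    # drop-in wrapper that records OpenAI calls

@observe()
def answer(question: str) -> str:
    # Each call to this function becomes a trace in Langfuse; the nested
    # OpenAI request is captured with its input, output, latency, and cost.
    response = openai.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

print(answer("What does Langfuse trace?"))

After running this, the trace with its nested generation should appear in your Langfuse project; the quickstart covers the equivalent setup for JavaScript and for framework integrations such as LangChain and LlamaIndex.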

Core Features

Essential tracing features form the foundation of observability in Langfuse.

Advanced Features

Extended capabilities support more sophisticated tracing and analysis.
