
Observability & Tracing

Langfuse is a platform for observability and tracing of LLM applications. It captures everything that happens during an LLM interaction: inputs, outputs, tool usage, retries, latencies, and costs.

Observability is the broader concept of understanding what is happening under the hood of your LLM application. Traces are the core Langfuse object used to achieve deep observability.
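As a rough illustration of what a trace contains, here is a minimal sketch using the v2-style Langfuse Python SDK (newer SDK versions expose a different, OpenTelemetry-based API). The trace name, model name, and payloads are placeholders.

```python
from langfuse import Langfuse

# Credentials are read from the LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY,
# and LANGFUSE_HOST environment variables.
langfuse = Langfuse()

# A trace groups everything that happened during one interaction.
trace = langfuse.trace(
    name="chat-request",                      # placeholder name
    input={"question": "What is Langfuse?"},  # placeholder input
    user_id="user-123",
)

# An observation of type "generation" records a single model call,
# including its input, output, model, and token usage (used for cost tracking).
generation = trace.generation(
    name="llm-call",
    model="gpt-4o",  # placeholder model name
    input=[{"role": "user", "content": "What is Langfuse?"}],
)
generation.end(
    output="Langfuse is an open-source LLM observability platform.",
    usage={"input": 12, "output": 18},
)

trace.update(output="Langfuse is an open-source LLM observability platform.")
langfuse.flush()  # ensure queued events are sent before the process exits
```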

Tracing Overview

Why Tracing?

  • Faster development & debugging: understand how the LLM/agent is behaving
  • Cost tracking for each model call
  • Reduce application latency
  • Set foundation for evaluations
  • Save hours in customer support

Why Trace with Langfuse?

  • Open Source: Langfuse is the fastest-growing open-source LLM observability platform.
  • Multi-Modal: Langfuse supports text, images, audio, and more.
  • Multi-Model: Langfuse supports all major LLM providers.
  • Framework Agnostic: Langfuse supports LangChain, OpenAI, LlamaIndex, and more.
  • Language Agnostic: Langfuse supports Python, JavaScript, and more.

Where to start

Follow the quickstart to add tracing to your LLM app.
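For orientation, a quickstart-style sketch of decorator-based tracing with the Python SDK might look like the following (function names and the nesting are illustrative; the `@observe` decorator shown here comes from the v2-style `langfuse.decorators` module):

```python
from langfuse.decorators import observe

@observe()  # creates a nested span for the retrieval step
def retrieve_context(question: str) -> list[str]:
    # Placeholder retrieval logic; replace with your vector store lookup.
    return ["Langfuse traces capture inputs, outputs, latencies, and costs."]

@observe(as_type="generation")  # recorded as an LLM generation
def generate_answer(question: str, context: list[str]) -> str:
    # Placeholder for an actual LLM call.
    return f"Based on {len(context)} documents: ..."

@observe()  # the outermost decorated call becomes the trace
def answer(question: str) -> str:
    context = retrieve_context(question)
    return generate_answer(question, context)

if __name__ == "__main__":
    print(answer("What is Langfuse?"))
```

The decorator captures the inputs, outputs, and timings of each decorated function automatically, and the calls appear as nested observations on a single trace in the Langfuse UI.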

Core Features

Essential tracing features that form the foundation of observability in Langfuse.

Advanced Features

Extended capabilities for sophisticated tracing and analysis.
