Langfuse just got faster →

Used by 19 of Fortune 50

10+ billion observations/month

100,000+ engineers building on Langfuse

Open Source LLM Engineering Platform

Debug AI applications and agents in minutes. Spot issues before your users do. Collaborate with your team to continuously improve on cost, latency, and quality.

Canva · Khan Academy

Gain deep visibility into your traces

Launch, observe, improve — repeat.

Langfuse helps you ship AI agents and products from prototype to production and beyond. Once in production, we power your continuous improvement loop, using production data to make your agents and LLM applications ever more powerful.

Langfuse platform overview

All the tools, one integrated platform.

One integrated platform to trace, manage prompts, evaluate, and experiment from prototype to production scale.

Works with any stack.

Langfuse works with any language and framework that supports OTel instrumentation. Additionally, 80+ integrations make getting started even easier. No framework lock-in.

Open Platform. Open Source.

We are huge fans of open standards and data portability. Langfuse won't lock in your data, ever.

Made for developers, loved by agents.

Langfuse works by default with your coding agents. Install our MCP servers and CLI to develop at the speed of thought. Let Claude Code, Cursor and Codex do the hard work.

SKILL.md

A ready-made skill for your coding agent. Manage prompts, traces, and evals through natural language — no manual API calls needed.

Langfuse CLI

Full API access from the terminal. Let coding agents manage Langfuse for you, or script your workflows in CI/CD.

Platform MCP Server

Interact with your Langfuse data programmatically from your IDE. Manage prompts, query traces, and more.

Enterprise Scale and Security.

Traditional observability handles many small spans. LLM systems run differently. Every step carries rich, verbose I/O that legacy platforms can't handle at scale. Langfuse ingests and queries LLM traces reliably at enterprise scale while following strict compliance frameworks.

Canva

Canva's AI team relies on Langfuse to trace and debug their generative design features in production.

Why use Langfuse?

Langfuse is the most widely adopted open-source LLM engineering platform. Developers who value open source and control over their data build production-grade agents and LLM applications with Langfuse.

Langfuse powers the entire development cycle from prototype to full scale production loads.

Get started. Free tier: 50k observations/month. No credit card required.

Start improving your agents
in under 5 minutes.

Claude Code · Cursor · Codex
or any other agent

Langfuse is an open-source LLM engineering platform that helps teams build, monitor, and improve their AI applications. It covers the full development lifecycle with tracing, prompt management, evaluations, and analytics dashboards, all in one place. Langfuse is used by 2,300+ companies and processes billions of observations per month. You can try it instantly with the public demo project or sign up for free.