June 4, 2025

Doubling Down on Open Source

Today we are open sourcing all remaining product features in Langfuse under the MIT license.

The LLM landscape is changing rapidly and so are the workflows used to build and improve LLM apps. Today, we are open sourcing all Product Features in Langfuse to enable our community to iterate on their applications faster and provide our project with feedback on where to go next.

Newly open sourced features include managed LLM-as-a-judge evaluations, annotation queues, prompt experiments and the playground, all of which are now freely available to self-host under the MIT license.

OSS Change Comparison

If you are already self-hosting Langfuse, upgrade your deployment to the latest version — you will find a more powerful platform waiting for you.

Why are we doing this?

Langfuse is building the open source LLM Engineering Platform: a platform to observe and improve LLM applications.

We are constantly shipping to be the technology of choice for our community. This requires their trust, feedback, and buy-in.

We revisited the gated features in our Enterprise Edition. If we want to be the first choice in the market, we need to allow our community to cover the entire development cycle in our FOSS version. Features like LLM-as-a-Judge evaluations or our Playground are market standard at this point and should be freely available. But why stop there?

The best platform for developers has to be open at its core. By removing commercial barriers from our product features, we’re fostering deeper trust, collaborating on contributions, accelerating adoption, gathering richer community feedback, and iterating faster than ever.

Our Open Source Journey

Langfuse was launched as an open source project. This was based on a few core beliefs:

  • Data captured by Langfuse should be freely accessible
  • The AI landscape changes every week – Langfuse must integrate agnostically with all models and application stacks
  • Great teams deserve the flexibility to extend the platform to support custom workflows

This positioning resonates and today we are doubling down on it.

Langfuse has been an open core company from the start. Our core just expanded significantly and our periphery shrank. Commercially licensed code is now limited to features for Enterprise Security and Platform Teams (e.g. SCIM, Audit Logs, Data Retention Policies — regular SSO is and continues to be MIT licensed).

Langfuse’s Open Core Model

Area                          Then         Now
Core Platform                 OSS (MIT)    OSS (MIT) ✅
LLM-as-a-Judge Evaluations    Commercial   OSS (MIT) ✅
Playground                    Commercial   OSS (MIT) ✅
Prompt Experiments            Commercial   OSS (MIT) ✅
Annotation/Data Labeling      Commercial   OSS (MIT) ✅
Enterprise Security           Commercial   Commercial 🔒
Enterprise Support            Commercial   Commercial 🔒

As we open source previously commercial features, we are focusing our commercial efforts on Langfuse Cloud and on Enterprise platform teams that self-host.

Thousands of Langfuse Deployments

Besides >7,000,000 monthly SDK installs and >5,500,000 Docker pulls, there are >8,000 monthly active self-hosted instances of Langfuse out in the wild. This boggles our minds.

We expect that today’s changes solidify Langfuse as the first choice for a powerful and truly open source platform in LLMOps.

Today is the day to start self-hosting Langfuse. Head over to our self-hosting docs and use the new Terraform modules for deploying Langfuse at production scale.


How can you follow and contribute to Langfuse?
