Langfuse in the Enterprise

Langfuse addresses key challenges when deploying LLM-based applications within an enterprise. Because it is open source, Langfuse can be self-hosted, which helps enterprises address data security and privacy concerns. This document outlines common queries related to using Langfuse in an enterprise setting.

Langfuse is licensed as an open-core project. Its core tracing features are MIT-licensed and freely available, including for commercial use. Some peripheral Langfuse features are not available in the open-source version and cannot be used out of the box. Please refer to the Enterprise Edition FAQ here. To discuss an enterprise license (self-hosted or cloud) for your team, please reach out to [email protected]. Enterprise licenses start at $500/month.

Select Introductory Resources

Introduction to Langfuse

Introducing Langfuse 2.0

Langfuse features along the development lifecycle



We collect the most common questions and answers here. If your question is not answered below, please reach out to us: [email protected]

What deployment options are available for Langfuse?

  1. Managed Cloud; see the Pricing and Security pages for details.
  2. Self-hosted on your own infrastructure. Contact us if you are interested in additional support. Note that some of the infrastructure requirements will change with Langfuse v3.

What is the difference between Langfuse Cloud and the open-source version?

The Langfuse team provides Langfuse Cloud as a managed solution to simplify the initial setup of Langfuse and to minimize the operational overhead of maintaining high availability in production. Alternatively, you can choose to self-host Langfuse on your own infrastructure.

Some features are not available in the open-source version. Please refer to the Enterprise Edition FAQ here.

How do Authentication and RBAC work in Langfuse?

Langfuse offers prebuilt roles that apply at the project level to restrict access (see the RBAC documentation).

If needed, environments (production, staging, development) can be separated into different projects in Langfuse to restrict access to production/sensitive data while making it easier to share development environments with the team and other stakeholders.

SSO with Langfuse is simple. Currently Google, GitHub, Azure AD, Okta, Auth0, and AWS Cognito are supported. We can easily add additional providers based on your requirements. As an enterprise customer, you can also enforce SSO for your organization.

What is the easiest way to try Langfuse?

The Hobby Plan on Langfuse Cloud includes enough resources to try Langfuse for free in a non-production environment, no credit card required.

Alternatively, you can quickly spin up Langfuse on your own machine using `docker compose up` (docs).
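As a minimal sketch of that local setup (assuming Docker and Docker Compose are installed; see the self-hosting docs for the authoritative steps):

```shell
# Clone the Langfuse repository and start the full stack locally.
# Assumes Docker and Docker Compose are installed.
git clone https://github.com/langfuse/langfuse.git
cd langfuse
docker compose up
```

Once the containers are running, the Langfuse UI is served locally (port 3000 by default).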

If you require security and compliance features to run a POC, please reach out to us at [email protected].

Common Enterprise LLM Platform Architecture

Langfuse aims to address the challenges of debugging, monitoring, and continuously improving LLM-based applications. It focuses on observability, evaluation, and prompt management.

Langfuse is often deployed alongside a central LLM Gateway that provides schema translation, rate limiting, and PII redaction. The LLM Gateway can be an internal service or an open-source project like LiteLLM. If you use LiteLLM, you can leverage the native integration (docs).

Talk to us

Schedule an introduction call to discuss your specific needs and requirements.
