Langfuse for Enterprise

Langfuse addresses key challenges when deploying LLM-based applications within an enterprise. As an open-source project that can be self-hosted, it is well suited to meet data security and privacy requirements. This document outlines common queries related to using Langfuse in an enterprise setting.

Langfuse is licensed as an open-core project. Its core product (tracing, observability, evals, prompt management, and API/SDK endpoints) is MIT-licensed and freely available, including for commercial use without limitation.


Your platform for enterprise-ready LLM applications

Upgrade to Langfuse Enterprise Edition to meet high security requirements and the needs of platform teams supporting billions of traces per month. With an Enterprise Agreement, you get access to additional support and to features that address your specific needs and compliance requirements.

Please refer to our feature overview and the Enterprise Edition FAQ here.

Langfuse is trusted by the best teams building LLM applications and AI agents, and supported by the largest open source community.


Select Resources

Introduction to Langfuse: a 10-minute demo of all Langfuse features.


Talk to us

Ready to get started with Langfuse Enterprise? We’re here to help you find the right solution for your team.


Resellers and managed Langfuse deployments

We partner with Shakudo, which operates Langfuse Enterprise self-hosted on behalf of customers within their VPC.

In Japan & APAC, GAO is a partner company based in Tokyo that provides professional services, technical support, and payment in local currency in Japan.

We are happy to introduce prospects to our contacts at Shakudo and GAO for more information on self-hosted / managed self-hosting options.


Any questions?

We’re here to help you find the right solution for your use case.


FAQ

We collect the most common questions and answers here. If your question is not answered, please reach out to us: [email protected]

What deployment options are available for Langfuse?
  1. Managed Cloud (cloud.langfuse.com); see the Pricing and Security pages for details.
  2. Self-hosted on your own infrastructure. Contact us if you are interested in additional support.

What is the difference between Langfuse Cloud and the open-source version?

The Langfuse team provides Langfuse Cloud as a managed solution to simplify the initial setup of Langfuse and to minimize the operational overhead of maintaining high availability in production. Alternatively, you can choose to self-host Langfuse on your own infrastructure.

Some features are not available in the open-source version. Please refer to the overview here.

How do authentication and RBAC work in Langfuse?

Langfuse offers a set of prebuilt roles that apply at the organization and project level to restrict access (RBAC documentation).

If needed, environments (production, staging, development) can be separated into different Langfuse projects to restrict access to production and sensitive data, while making it easier to share development environments with the team and other stakeholders.
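
As an illustrative sketch with the Langfuse Python SDK: each project has its own API key pair, so pointing each deployment environment at a separate project keeps production traces isolated. The key values and host below are placeholders read from the environment.

    import os

    from langfuse import Langfuse

    # Each Langfuse project has its own API key pair. Pointing each
    # deployment environment (production, staging, development) at a
    # separate project keeps production traces isolated from the rest.
    langfuse = Langfuse(
        public_key=os.environ["LANGFUSE_PUBLIC_KEY"],  # e.g. key pair of a hypothetical "my-app-prod" project
        secret_key=os.environ["LANGFUSE_SECRET_KEY"],
        host=os.environ.get("LANGFUSE_HOST", "https://cloud.langfuse.com"),
    )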

SSO with Langfuse is simple. Currently Google, GitHub, Azure AD, Okta, Auth0, and AWS Cognito are supported. We can easily add additional providers based on your requirements. As an enterprise customer, you can also enforce SSO for your organization.
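
For self-hosted deployments, SSO providers are configured via environment variables. A minimal sketch for Google as the provider follows; the variable names are taken from the self-hosting docs at the time of writing, so verify them against the current documentation for your Langfuse version.

    # OAuth credentials for the Google provider (set on the Langfuse container)
    AUTH_GOOGLE_CLIENT_ID="your-google-client-id"
    AUTH_GOOGLE_CLIENT_SECRET="your-google-client-secret"
    # Enforce SSO for all users whose email address belongs to this domain
    AUTH_DOMAINS_WITH_SSO_ENFORCEMENT="example.com"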

What is the easiest way to try Langfuse?

The Hobby Plan on Langfuse Cloud includes enough resources to try Langfuse for free in a non-production environment, with no credit card required.

Alternatively, you can quickly spin up Langfuse on your own machine using docker compose up (docs); see the commands below.
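
For reference, the quickstart commands (repository URL as published on GitHub):

    # Clone the Langfuse repository and start all services locally
    git clone https://github.com/langfuse/langfuse.git
    cd langfuse
    docker compose up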

If you require security and compliance features to run a POC, please reach out to us at [email protected].

Common Enterprise LLM Platform Architecture

Langfuse aims to address the challenges of debugging, monitoring, and continuously improving LLM-based applications. It focuses on observability, evaluation, and prompt management.

Langfuse is often deployed alongside a central LLM Gateway that provides schema translation, rate limiting, and PII redaction. The LLM Gateway can be an internal service or an open-source project like LiteLLM. If you use LiteLLM, you can leverage the native integration (docs).
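
As a minimal sketch of that integration: LiteLLM forwards traces to Langfuse through its callback mechanism. The model name and prompt are illustrative; Langfuse credentials are read from the LANGFUSE_PUBLIC_KEY and LANGFUSE_SECRET_KEY environment variables.

    import litellm

    # Route success and failure events from every LiteLLM call to Langfuse.
    litellm.success_callback = ["langfuse"]
    litellm.failure_callback = ["langfuse"]

    # Any completion made through LiteLLM is now traced in Langfuse.
    response = litellm.completion(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Hello, world"}],
    )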
