Langfuse in the Enterprise

Langfuse addresses key challenges of deploying LLM-based applications within an enterprise. As an open source project, Langfuse can be self-hosted, making it well suited to address data security and privacy concerns. This document outlines common questions about using Langfuse in an enterprise setting.

Langfuse is licensed as an open-core project. Its core product (tracing, observability, evals, prompt management, and API/SDK endpoints) is MIT-licensed and freely available, including for commercial use without limitation. Some Langfuse features on the periphery of the core product are not available in the open-source version. As of today, these are either a) quality-of-life features (such as the LLM-as-a-judge service and prompt playground) or b) security and compliance features (e.g. SSO enforcement, data retention).

Please refer to our feature overview and the Enterprise Edition FAQ here. Please reach out to [email protected] to discuss an enterprise license (self-hosted or cloud) for your team.

Select Resources

Talk to us

Reach out via email ([email protected]) or schedule a demo call to discuss your specific needs and requirements.

Introduction to Langfuse

Introducing Langfuse 2.0

Langfuse features along the development lifecycle

Resources

Resellers and managed Langfuse deployments

We partner with Shakudo, who operate Langfuse Enterprise self-hosted on behalf of customers in their VPC. We are happy to introduce prospects to our contact at Shakudo for more information on managed self-hosting.

FAQ

We collect the most common questions and answers here. If your question is not answered below, please reach out to us: enterprise@langfuse.com

What deployment options are available for Langfuse?

  1. Managed Cloud (cloud.langfuse.com); see the Pricing and Security pages for details.
  2. Self-hosted on your own infrastructure. Contact us if you are interested in additional support. Note that some of the infrastructure requirements will change with Langfuse v3.

What is the difference between Langfuse Cloud and the open-source version?

The Langfuse team provides Langfuse Cloud as a managed solution that simplifies the initial setup of Langfuse and minimizes the operational overhead of maintaining high availability in production. Alternatively, you can choose to self-host Langfuse on your own infrastructure.

Some features are not available in the open-source version. Please refer to the overview here.

How do authentication and RBAC work in Langfuse?

Langfuse offers a set of prebuilt roles that apply at the organization and project level to restrict access (RBAC documentation).

If needed, environments (production, staging, development) can be separated into different projects in Langfuse to restrict access to production/sensitive data while making it easier to share development environments with the team and other stakeholders.
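As a sketch of this environment-per-project setup, an application can select the project API keys that match its runtime environment. The key values and the `APP_ENV` variable below are illustrative conventions assumed for this example, not something Langfuse mandates:

```python
import os

# Illustrative sketch: one Langfuse project per environment, as described
# above. Each project has its own public/secret key pair; the values here
# are placeholders.
PROJECT_KEYS = {
    "production": ("pk-lf-prod", "sk-lf-prod"),
    "staging": ("pk-lf-staging", "sk-lf-staging"),
    "development": ("pk-lf-dev", "sk-lf-dev"),
}

def credentials_for(env: str) -> tuple:
    """Return the (public_key, secret_key) pair for the given environment."""
    if env not in PROJECT_KEYS:
        raise ValueError(f"unknown environment: {env}")
    return PROJECT_KEYS[env]

# Pick credentials based on a hypothetical APP_ENV variable.
public_key, secret_key = credentials_for(os.getenv("APP_ENV", "development"))
print(public_key)
```

Because production and development are separate projects, the production secret key can be withheld from developers entirely while the development project stays freely shareable.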

SSO with Langfuse is simple. Currently Google, GitHub, Azure AD, Okta, Auth0, and AWS Cognito are supported. We can easily add additional providers based on your requirements. As an enterprise customer, you can also enforce SSO for your organization.

What is the easiest way to try Langfuse?

The Hobby Plan on Langfuse Cloud includes enough resources to try Langfuse for free while in a non-production environment, no credit card required.

Alternatively, you can quickly spin up Langfuse on your own machine using docker compose up (docs).

If you require security and compliance features to run a POC, please reach out to us at [email protected].

Common Enterprise LLM Platform Architecture

Langfuse aims to address the challenges of debugging, monitoring, and continuously improving LLM-based applications. It focuses on observability, evaluation, and prompt management.
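To make the observability piece concrete, the sketch below assembles a single trace event for Langfuse's public ingestion endpoint using only the Python standard library. The endpoint path and payload schema shown here are simplified assumptions for illustration; consult the Langfuse API reference (or use the official SDKs, which handle this for you) for the exact contract:

```python
import base64
import json
import uuid
from datetime import datetime, timezone

def build_ingestion_request(public_key: str, secret_key: str,
                            host: str, trace_name: str) -> dict:
    """Assemble URL, headers, and body for one trace-create event (sketch)."""
    # Basic auth with the project's public/secret key pair.
    token = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
    event = {
        "id": str(uuid.uuid4()),
        "type": "trace-create",
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "body": {"name": trace_name},
    }
    return {
        "url": f"{host}/api/public/ingestion",
        "headers": {
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
        },
        # Events are sent in batches; here the batch holds a single event.
        "body": json.dumps({"batch": [event]}),
    }

req = build_ingestion_request("pk-lf-...", "sk-lf-...",
                              "https://cloud.langfuse.com", "my-first-trace")
print(req["url"])
```

In practice the SDKs batch and flush events asynchronously in the background, so instrumentation adds no latency to the request path.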

Langfuse is often deployed alongside a central LLM Gateway that provides schema translation, rate limiting, and PII redaction. The LLM Gateway can be an internal service or an open-source project like LiteLLM. If you use LiteLLM, you can leverage the native integration (docs).
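As a toy illustration of one gateway responsibility mentioned above, the function below redacts email addresses from a prompt before it would be forwarded to a model provider. A real gateway such as LiteLLM provides this kind of middleware as configuration rather than hand-written code; this standalone function is only a sketch:

```python
import re

# Simple email pattern for illustration; production PII redaction uses far
# more robust detection (names, phone numbers, account IDs, etc.).
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def redact_pii(prompt: str) -> str:
    """Replace email addresses with a placeholder before forwarding."""
    return EMAIL_RE.sub("[REDACTED_EMAIL]", prompt)

print(redact_pii("Contact jane.doe@example.com about the invoice."))
# → Contact [REDACTED_EMAIL] about the invoice.
```

Because the gateway sits between all applications and the model providers, redaction applied there also ensures that the traces Langfuse stores never contain the redacted fields.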

