
Integration Overview

Integrate your application with Langfuse to explore production traces and metrics.

Objectives:

  1. Capture traces of your application
  2. Add scores to these traces to measure and evaluate the quality of its outputs (see the sketch below)
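
As a quick illustration, the following sketch uses the Python SDK to create a trace and attach a score to it. It is a minimal example under stated assumptions: the `LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY`, and `LANGFUSE_HOST` environment variables are set, and method names such as `trace`, `score`, and `flush` follow the v2-style Python SDK and may differ in other versions.

```python
from langfuse import Langfuse

# Reads LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_HOST from the environment.
langfuse = Langfuse()

# 1. Capture a trace of one application run.
trace = langfuse.trace(name="qa-request", input={"question": "What is Langfuse?"})
trace.update(output={"answer": "An open-source LLM engineering platform."})

# 2. Add a score to the trace to measure/evaluate output quality.
langfuse.score(trace_id=trace.id, name="helpfulness", value=1.0)

# Flush queued events before the process exits (the SDK sends data asynchronously).
langfuse.flush()
```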

There are currently five main ways to integrate with Langfuse:

Main Integrations

| Integration | Supports | Description |
| --- | --- | --- |
| SDK | Python, JS/TS | Manual instrumentation using the SDKs for full flexibility. |
| OpenAI | Python, JS/TS | Automated instrumentation using a drop-in replacement of the OpenAI SDK. |
| Langchain | Python, JS/TS | Automated instrumentation by passing a callback handler to your Langchain application. |
| LlamaIndex | Python | Automated instrumentation via the LlamaIndex callback system. |
| API | | Directly call the public API. OpenAPI spec available. |
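
For example, the OpenAI integration works as a drop-in replacement: only the import changes, and each completion call is captured as a trace automatically. The snippet below is a sketch of the Python variant; it assumes the OpenAI and Langfuse API keys are provided via environment variables, and the exact import path may vary between SDK versions.

```python
# Drop-in replacement: import the OpenAI client through Langfuse instead of `import openai`.
from langfuse.openai import openai

# The request is sent to OpenAI as usual; Langfuse records it as a trace in the background.
completion = openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Summarize Langfuse in one sentence."}],
)
print(completion.choices[0].message.content)
```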

Packages integrated with Langfuse

| Name | Description |
| --- | --- |
| Flowise | JS/TS no-code builder for customized LLM flows. |
| Instructor | Library to get structured LLM outputs (JSON, Pydantic). |
| Langflow | Python-based UI for LangChain, designed with react-flow to provide an effortless way to experiment and prototype flows. |
| LiteLLM | Use any LLM as a drop-in replacement for GPT. Use Azure, OpenAI, Cohere, Anthropic, Ollama, VLLM, Sagemaker, HuggingFace, Replicate (100+ LLMs). |
| Superagent | Open-source AI assistant framework & API for prototyping and deploying agents. |
| AI SDK by Vercel | TypeScript SDK that makes streaming LLM outputs easy. |
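
Most of these packages need only minimal configuration to forward traces to Langfuse. As one hedged illustration, the sketch below enables LiteLLM's callback-based integration; it assumes LiteLLM ships a `langfuse` success callback and that the standard `LANGFUSE_*` environment variables are set, so treat it as a sketch rather than a complete setup.

```python
import litellm

# Send successful LLM calls to Langfuse; credentials are read from LANGFUSE_* environment variables.
litellm.success_callback = ["langfuse"]

# Works with any provider supported by LiteLLM; the call shows up as a trace in Langfuse.
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello from LiteLLM!"}],
)
print(response)
```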

Unsure which integration to choose? Ask us on Discord or in the chat.
