Integration Overview
Integrate your application with Langfuse to explore production traces and metrics.
Objectives:
- Capture traces of your application
- Add scores to these traces to measure and evaluate the quality of outputs
There are currently eight main ways to integrate with Langfuse:
Main Integrations
| Integration | Supports | Description |
| --- | --- | --- |
| SDK | Python, JS/TS | Manual instrumentation using the SDKs for full flexibility. |
| OpenAI | Python, JS/TS | Automated instrumentation via a drop-in replacement of the OpenAI SDK. |
| Langchain | Python, JS/TS | Automated instrumentation by passing a callback handler to your Langchain application. |
| LlamaIndex | Python | Automated instrumentation via the LlamaIndex callback system. |
| Haystack | Python | Automated instrumentation via the Haystack content tracing system. |
| LiteLLM | Python, JS/TS (proxy only) | Use any LLM as a drop-in replacement for GPT. Supports Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, Hugging Face, Replicate, and 100+ other LLMs. |
| Vercel AI SDK | JS/TS | TypeScript toolkit designed to help developers build AI-powered applications with React, Next.js, Vue, Svelte, and Node.js. |
| API | | Directly call the public API. OpenAPI spec available. |
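All of the integrations above ultimately report traces and scores through the public API. As a minimal sketch of what that looks like, the snippet below builds (but does not send) a batch ingestion payload containing one trace and one score attached to it. The endpoint path (`/api/public/ingestion`), the event types (`trace-create`, `score-create`), and the field names are assumptions here; verify them against the published OpenAPI spec before relying on them.

```python
import json
import os
import uuid
from datetime import datetime, timezone

# Assumed ingestion endpoint -- confirm against the OpenAPI spec.
LANGFUSE_HOST = os.environ.get("LANGFUSE_HOST", "https://cloud.langfuse.com")
INGESTION_URL = f"{LANGFUSE_HOST}/api/public/ingestion"

def make_event(event_type: str, body: dict) -> dict:
    """Wrap an event body in the envelope the batch ingestion API expects
    (event id, type, timestamp). Field names are assumptions."""
    return {
        "id": str(uuid.uuid4()),  # idempotency key for this event
        "type": event_type,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "body": body,
    }

trace_id = str(uuid.uuid4())
payload = {
    "batch": [
        # 1) Capture a trace of one application run ...
        make_event("trace-create", {
            "id": trace_id,
            "name": "rag-query",          # hypothetical trace name
            "input": "What is Langfuse?",
        }),
        # 2) ... and attach a score to that trace to evaluate output quality.
        make_event("score-create", {
            "traceId": trace_id,
            "name": "correctness",        # hypothetical score name
            "value": 0.9,
        }),
    ],
}

# In a real integration you would POST `payload` to INGESTION_URL with
# basic auth (public key / secret key). Serialized here only, so the
# sketch runs without credentials or network access:
print(json.dumps(payload, indent=2)[:200])
```

The SDKs and framework integrations handle this batching, retrying, and authentication for you, which is why the table above recommends them over raw API calls for most applications.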
Packages integrated with Langfuse
| Name | Description |
| --- | --- |
| Instructor | Library to get structured LLM outputs (JSON, Pydantic). |
| DSPy | Framework that systematically optimizes language model prompts and weights. |
| Dify | Open-source LLM app development platform with a no-code builder. |
| Ollama | Easily run open-source LLMs on your own machine. |
| Mirascope | Python toolkit for building LLM applications. |
| Flowise | JS/TS no-code builder for customized LLM flows. |
| Langflow | Python-based UI for LangChain, designed with react-flow to provide an effortless way to experiment with and prototype flows. |
Unsure which integration to choose? Ask us on Discord or in the chat.
End-to-end examples
If you want to see how things work together, you can look at the end-to-end examples below. They are Jupyter notebooks that you can easily run in Google Colab or locally.
Generally, we recommend reading the get started guides for each integration first.
Integrations
- Integration Azure OpenAI Langchain
- Integration DSPy
- Integration Haystack
- Integration Langchain
- Integration LangServe
- Integration LiteLLM Proxy
- Integration LlamaIndex
- Integration LlamaIndex Instrumentation
- Integration LlamaIndex Milvus Lite
- Integration Mirascope
- Integration Mistral SDK
- Integration OpenAI SDK
- JS Integration Langchain
- JS Integration LiteLLM Proxy
- JS Integration OpenAI
- JS Tracing Example Vercel AI SDK
- Observe OpenAI Structured Outputs with Langfuse
- Ollama Observability and Tracing for local LLMs using Langfuse
- Open Source Observability for LangGraph
- OSS Observability for Instructor
- OSS Observability for OpenAI Assistants API