# Observability for Mastra with Langfuse
This guide shows you how to integrate Langfuse with Mastra for observability and tracing. By following these steps, you’ll be able to monitor and debug your Mastra agents in the Langfuse dashboard.
> **What is Mastra?** Mastra is a TypeScript agent framework designed to provide the essential primitives for building AI applications. It enables developers to create AI agents with memory and tool-calling capabilities, implement deterministic LLM workflows, and leverage RAG for knowledge integration.
## Integration
### Create a Mastra project
If you don't have a Mastra project yet, you can create one using the Mastra CLI:

```bash
npx create-mastra
```
Move into the project directory:

```bash
cd your-mastra-project
```
You can find the full Mastra installation instructions in the Mastra documentation.
### Set up a Langfuse project
Create a project in Langfuse and get your API keys from the project settings page.
### Add environment variables
Create or update your `.env.development` file with the following variables:
```bash
# Your LLM API key
OPENAI_API_KEY=your-api-key

# Langfuse credentials
LANGFUSE_SECRET_KEY=sk-lf-...
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_HOST=https://cloud.langfuse.com # Optional. Defaults to https://cloud.langfuse.com
```
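If you want to fail fast when a credential is missing, you can add a small startup check. A minimal sketch (the `env-check.ts` file name is illustrative; it only checks the variables used in this guide):

```typescript
// env-check.ts (hypothetical helper): throw early if required variables are missing
const required = [
  "OPENAI_API_KEY",
  "LANGFUSE_SECRET_KEY",
  "LANGFUSE_PUBLIC_KEY",
] as const;

for (const key of required) {
  if (!process.env[key]) {
    throw new Error(`Missing environment variable: ${key}`);
  }
}
```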
### Install the `langfuse-vercel` package
Add the `langfuse-vercel` package to your project:

```bash
npm install langfuse-vercel
```
### Set up an agent
Create an agent in your project. For example, create a file `agents/chefAgent.ts`:
```typescript
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";

export const chefAgent = new Agent({
  name: "chef-agent",
  instructions:
    "You are Michel, a practical and experienced home chef. " +
    "You help people cook with whatever ingredients they have available.",
  model: openai("gpt-4o-mini"),
});
```
You can use any model provider from `ai-sdk`.
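For example, to point the same agent at Anthropic instead of OpenAI, a sketch assuming `@ai-sdk/anthropic` is installed and `ANTHROPIC_API_KEY` is set (the model ID is illustrative):

```typescript
import { Agent } from "@mastra/core/agent";
import { anthropic } from "@ai-sdk/anthropic";

// Same agent definition as above, with a different model provider
export const chefAgent = new Agent({
  name: "chef-agent",
  instructions:
    "You are Michel, a practical and experienced home chef. " +
    "You help people cook with whatever ingredients they have available.",
  model: anthropic("claude-3-5-sonnet-latest"),
});
```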
### Register the agent and configure Langfuse
Create or update your Mastra instance configuration to register the agent and configure the Langfuse integration. For example, create a file `mastra.ts`:
```typescript
import { Mastra } from "@mastra/core";
import { LangfuseExporter } from "langfuse-vercel";
import { chefAgent } from "./agents/chefAgent";

export const mastra = new Mastra({
  agents: { chefAgent },
  telemetry: {
    serviceName: "ai", // must be set to "ai" so that the LangfuseExporter recognizes the spans as AI SDK traces
    enabled: true,
    export: {
      type: "custom",
      exporter: new LangfuseExporter({
        publicKey: process.env.LANGFUSE_PUBLIC_KEY,
        secretKey: process.env.LANGFUSE_SECRET_KEY,
        baseUrl: process.env.LANGFUSE_HOST,
      }),
    },
  },
});
```
The `telemetry.serviceName` field must be set to `"ai"` so that the LangfuseExporter recognizes the emitted spans as AI SDK traces.
### Run the Mastra dev server
Start the Mastra development server:

```bash
npm run dev
```
Open the developer playground at the URL printed in the terminal and start chatting with your agent.
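You can also call the agent programmatically through the registered Mastra instance; traces are exported to Langfuse the same way. A minimal sketch, assuming the `mastra.ts` file from the previous step is importable:

```typescript
import { mastra } from "./mastra";

// Look up the registered agent and generate a response.
// The telemetry configuration above exports the resulting trace to Langfuse.
const agent = mastra.getAgent("chefAgent");
const result = await agent.generate("What can I cook with rice, eggs, and spinach?");
console.log(result.text);
```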
### View traces in Langfuse
Head over to your Langfuse dashboard and you’ll see the traces from your agent interactions. You can analyze the prompts, completions, and other details of your AI interactions.
Here's an example of a trace:

*(Screenshot: an example trace of an agent interaction in the Langfuse dashboard.)*
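If you prefer to inspect traces from code instead of the dashboard, the Langfuse JavaScript SDK can fetch them through the public API. A sketch, assuming you additionally install the `langfuse` package (`npm install langfuse`):

```typescript
import { Langfuse } from "langfuse";

const langfuse = new Langfuse({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY,
  secretKey: process.env.LANGFUSE_SECRET_KEY,
  baseUrl: process.env.LANGFUSE_HOST,
});

// List the ten most recent traces
const { data: traces } = await langfuse.fetchTraces({ limit: 10 });
for (const trace of traces) {
  console.log(trace.id, trace.name, trace.timestamp);
}
```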