Prompt Management for Vercel AI SDK
Langfuse Prompt Management now integrates natively with the Vercel AI SDK. Version and release prompts in Langfuse, use them via the Vercel AI SDK, and monitor their metrics in Langfuse.
Kicking off Launch Week 2 one day early with the first of 6 announcements.
Introduction
What is the Vercel AI SDK?
The Vercel AI SDK is a production-ready framework for building AI applications in JavaScript/TypeScript. It features type-safe outputs, React state management hooks, and streaming support while remaining model-agnostic.
Langfuse Tracing for Vercel AI SDK
Langfuse Tracing integrates with your Vercel AI SDK application via OpenTelemetry to monitor LLM requests. Track key metrics like quality, cost, and latency through detailed application traces.
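For reference, tracing is typically enabled by registering an OpenTelemetry exporter. A minimal sketch for a Next.js app, assuming the langfuse-vercel and @vercel/otel packages; the service name is illustrative:

```typescript
// instrumentation.ts (Next.js)
import { registerOTel } from "@vercel/otel";
import { LangfuseExporter } from "langfuse-vercel";

export function register() {
  // Export Vercel AI SDK telemetry spans to Langfuse. The exporter reads
  // LANGFUSE_SECRET_KEY, LANGFUSE_PUBLIC_KEY, and LANGFUSE_BASEURL from env.
  registerOTel({
    serviceName: "my-ai-app", // illustrative
    traceExporter: new LangfuseExporter(),
  });
}
```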
What’s new?
Langfuse Prompt Management lets you deploy and iterate on prompts without redeploying your application. Key features include client-side caching with asynchronous refreshing and flexible management through the UI, SDK, or API.
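For example, the client-side cache can be tuned when fetching a prompt. A minimal sketch, assuming the langfuse TypeScript SDK; the TTL value is illustrative:

```typescript
import { Langfuse } from "langfuse";

// Reads LANGFUSE_SECRET_KEY, LANGFUSE_PUBLIC_KEY, LANGFUSE_BASEURL from env
const langfuse = new Langfuse();

// The SDK serves the prompt from a client-side cache and refreshes it
// asynchronously in the background once the TTL expires.
const prompt = await langfuse.getPrompt("my-prompt", undefined, {
  cacheTtlSeconds: 300, // illustrative TTL; a default applies if omitted
});
```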
By linking Vercel AI SDK traces with Langfuse prompt versions, you can now:
- Track which prompt version was used in any trace
- Debug prompt-related issues
- Monitor performance metrics and usage per prompt version
How to link a trace with a prompt version
Prerequisites:
- Vercel AI SDK installed in your application (guide).
- Langfuse Tracing enabled for Vercel AI SDK (guide).
- A prompt created in Langfuse (guide); see the sketch after this list.
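Prompts can be created in the Langfuse UI or programmatically. A minimal sketch of the latter, assuming a recent langfuse SDK; the prompt name and text are illustrative:

```typescript
import { Langfuse } from "langfuse";

const langfuse = new Langfuse();

// Creates a new version of "my-prompt" and labels it as the
// production version that getPrompt() serves by default.
await langfuse.createPrompt({
  name: "my-prompt",
  prompt: "You are a helpful assistant. Answer concisely.",
  labels: ["production"],
});
```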
Link Langfuse prompts to Vercel AI SDK generations by setting the `langfusePrompt` property in the `metadata` field:
import { generateText } from "ai";
import { Langfuse } from "langfuse";
const langfuse = new Langfuse();
// fetch prompt from Langfuse Prompt Management
const fetchedPrompt = await langfuse.getPrompt("my-prompt");
const result = await generateText({
model: openai("gpt-4o"),
prompt: fetchedPrompt.prompt, // use prompt
experimental_telemetry: {
isEnabled: true, // enable tracing
metadata: {
langfusePrompt: fetchedPrompt.toJSON(), // link trace to prompt version
},
},
});
Example
The resulting generation will be linked to the prompt version in the Langfuse trace.
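If your prompt contains template variables, you can compile it before passing it to the Vercel AI SDK; the trace still links to the original prompt version. A minimal sketch, assuming a text prompt named movie-critic with a {{movie}} variable (both illustrative):

```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { Langfuse } from "langfuse";

const langfuse = new Langfuse();
const fetchedPrompt = await langfuse.getPrompt("movie-critic");

const result = await generateText({
  model: openai("gpt-4o"),
  prompt: fetchedPrompt.compile({ movie: "Dune 2" }), // fill template variables
  experimental_telemetry: {
    isEnabled: true,
    metadata: {
      langfusePrompt: fetchedPrompt.toJSON(), // still links the prompt version
    },
  },
});
```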
Try it out
See the prompt management docs for more details.