Agentic Onboarding & Docs MCP Server


Integrate Langfuse into your existing codebase using AI agents like Cursor, GitHub Copilot, and Windsurf. Powered by the new Langfuse Docs MCP server.
To make the Langfuse docs more accessible to AI agents, we’ve added a new MCP server. This server exposes relevant context from the Langfuse docs, GitHub Issues, and GitHub Discussions to your agent.
Instead of manually integrating Langfuse into your existing codebase, you can now use agents like Cursor, GitHub Copilot, and Windsurf to automatically add tracing to your LLM applications.
Thank you, @JohnRichardEnders, for your contribution to this feature!
Getting started
On the Get Started page, you’ll now find a new “✨ Agentic” tab that provides:
- One-click MCP server installation for popular editors
- Step-by-step setup instructions for Cursor, GitHub Copilot in VSCode, Windsurf, and other MCP clients (an example configuration is sketched after this list)
- A ready-to-use prompt that you can copy and paste into your agent
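For Cursor, for instance, registering the Docs MCP server comes down to a small configuration entry. The snippet below is a minimal sketch, assuming the server is exposed as a remote MCP endpoint at https://langfuse.com/api/mcp and that your editor supports URL-based MCP servers; check the setup instructions in the tab for the exact values for your client:

```json
{
  "mcpServers": {
    "langfuse-docs": {
      "url": "https://langfuse.com/api/mcp"
    }
  }
}
```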
The agent will automatically:
- Detect which LLM libraries you’re using (OpenAI, Anthropic, LangChain, etc.)
- Choose the appropriate Langfuse integration method
- Add the necessary imports and configuration
- Wrap your existing LLM calls with Langfuse tracing (see the sketch after this list)
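As an illustration of the kind of edit the agent makes, here is a minimal sketch using Langfuse’s drop-in wrapper for the OpenAI Python SDK. The model name and prompt are placeholders, and Langfuse credentials are assumed to be set via the LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, and LANGFUSE_HOST environment variables; other libraries are instrumented through their respective integrations.

```python
# Before: `from openai import OpenAI`
# After: the Langfuse drop-in wrapper traces every request automatically.
from langfuse.openai import OpenAI

# Placeholder call; Langfuse credentials are read from the
# LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_HOST env vars.
client = OpenAI()

completion = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello, Langfuse!"}],
)
print(completion.choices[0].message.content)
```

With a change like this, each LLM call shows up as a trace in Langfuse without further edits to the call sites.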
Try it out
This feature is experimental, so results may vary depending on the complexity of your codebase. We’d love to hear your feedback and experiences on GitHub.