Get Started with Langfuse Tracing
This quickstart helps you integrate your LLM application with Langfuse Tracing. It logs a single LLM call to get you started.
If you are looking for other features, see the overview.
Create a new project in Langfuse
- Create a Langfuse account or self-host Langfuse
- Create a new project
- Create new API credentials in the project settings
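Optionally, verify that the credentials work before instrumenting anything. A minimal sketch (the filename is hypothetical; auth_check() is meant for quick sanity checks, not for every production start-up):

check_connection.py
from langfuse import Langfuse

# Reads LANGFUSE_SECRET_KEY, LANGFUSE_PUBLIC_KEY and LANGFUSE_HOST
# from the environment.
langfuse = Langfuse()

# Returns True if the credentials and host are valid.
print(langfuse.auth_check())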
Log your first LLM call to Langfuse
The @observe() decorator makes it easy to trace any Python LLM application. In this quickstart we also use the Langfuse OpenAI integration to automatically capture all model parameters.
Not using OpenAI? Switch to the "Python Decorator + any LLM" tab; a minimal sketch of that approach also follows the example below.
pip install langfuse openai
.env
LANGFUSE_SECRET_KEY="sk-lf-..."
LANGFUSE_PUBLIC_KEY="pk-lf-..."
# EU region
LANGFUSE_HOST="https://cloud.langfuse.com"
# US region
# LANGFUSE_HOST="https://us.cloud.langfuse.com"
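The Langfuse SDK reads these variables from the process environment. If you keep them in a .env file as shown, one option (an assumption, not a Langfuse requirement) is to load the file with python-dotenv before initializing the integration:

from dotenv import load_dotenv

# Populate the LANGFUSE_* variables from .env before the Langfuse
# SDK is initialized (install with: pip install python-dotenv).
load_dotenv()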
main.py
from langfuse.decorators import observe
from langfuse.openai import openai  # drop-in replacement that traces all OpenAI calls

@observe()  # creates a trace for each invocation
def story():
    return openai.chat.completions.create(
        model="gpt-3.5-turbo",
        max_tokens=100,
        messages=[
            {"role": "system", "content": "You are a great storyteller."},
            {"role": "user", "content": "Once upon a time in a galaxy far, far away..."},
        ],
    ).choices[0].message.content

@observe()
def main():
    return story()

main()
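If you are not using the OpenAI integration, the decorator alone still creates the trace; you report the model call yourself. A minimal sketch, where my_llm_client and the model name are placeholders for whatever SDK you use:

from langfuse.decorators import langfuse_context, observe

@observe(as_type="generation")  # record this function as an LLM generation
def call_llm(prompt: str) -> str:
    completion = my_llm_client.complete(prompt)  # placeholder LLM call
    # Attach model details to the current observation so they
    # show up on the trace in Langfuse.
    langfuse_context.update_current_observation(
        input=prompt,
        output=completion,
        model="my-model-name",  # hypothetical model identifier
    )
    return completion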
Done! Now visit the Langfuse interface to look at the trace you just created.
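Langfuse sends events asynchronously in the background. In short-lived scripts or serverless functions, the process can exit before the queue is drained, so flush before exiting (see the FAQ entry on serverless below):

from langfuse.decorators import langfuse_context

# Block until all buffered events have been delivered to Langfuse.
langfuse_context.flush()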
All Langfuse platform features
This was a very brief introduction to get started with Langfuse. Explore all Langfuse platform features in detail.
- Develop
- Monitor
- Test

References
- Python Decorator
- Python low-level SDK
- JS/TS SDK
- OpenAI SDK
- Langchain
- LlamaIndex
- API reference
- Flowise
- Langflow
- Litellm
FAQ
- How to use Langfuse Tracing in Serverless Functions (AWS Lambda, Vercel, Cloudflare Workers, etc.)
- What data regions does Langfuse Cloud support?
- How to manage different environments in Langfuse?
- I have set up Langfuse, but I do not see any traces in the dashboard. How to solve this?
- Where do I find my Langfuse API keys?