Observability for xAI / Grok with Langfuse
This guide shows you how to integrate Grok with Langfuse using the OpenAI SDK.
What is Grok? Grok is xAI's family of large language models, served through an OpenAI-compatible API for building intelligent applications. Learn more at the Grok Documentation.
What is Langfuse? Langfuse is an open source LLM engineering platform that helps teams trace API calls, monitor performance, and debug issues in their AI applications.
Step 1: Install Dependencies
Make sure you have installed the necessary Python packages:
```python
%pip install openai langfuse
```
Step 2: Set Up Environment Variables
```python
import os

# Get keys for your project from the project settings page
# https://cloud.langfuse.com
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"  # 🇪🇺 EU region
# os.environ["LANGFUSE_HOST"] = "https://us.cloud.langfuse.com"  # 🇺🇸 US region

# Get your Grok API key from your Grok account settings
os.environ["GROK_API_KEY"] = "xai-..."
```
Step 3: Use Grok with the OpenAI SDK
To use Grok through the OpenAI SDK, import the Langfuse drop-in replacement for the OpenAI client and point the base URL at Grok's endpoint.
```python
# Instead of importing openai directly:
from langfuse.openai import openai

client = openai.OpenAI(
    api_key=os.environ.get("GROK_API_KEY"),
    base_url="https://api.x.ai/v1"  # Grok's endpoint
)
```
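Because this is a standard OpenAI client, you can sanity-check the connection before tracing anything. A sketch, assuming Grok's endpoint implements the OpenAI-compatible models route:

```python
# List the models available to your key (assumes GET /v1/models is supported)
for model in client.models.list():
    print(model.id)
```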
Step 4: Run an Example
The following example demonstrates how to make a simple request using Grok’s API. All API calls will be automatically traced by Langfuse.
```python
response = client.chat.completions.create(
    model="grok-2-latest",
    messages=[
        {"role": "system", "content": "You are an assistant."},
        {"role": "user", "content": "What is Langfuse?"},
    ],
    name="Grok-2-Trace",  # sets the trace name in Langfuse
)

print(response.choices[0].message.content)
```
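Streaming is captured by Langfuse in the same way. A sketch, assuming Grok's endpoint supports OpenAI-style streaming:

```python
stream = client.chat.completions.create(
    model="grok-2-latest",
    messages=[{"role": "user", "content": "Summarize Langfuse in one sentence."}],
    stream=True,
)

# Print tokens as they arrive; guard against chunks without content
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```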
Step 5: Enhance Tracing (Optional)
You can enhance your Grok traces:
- Add metadata, tags, log levels, and user IDs to traces
- Group traces by sessions
- Use the `@observe()` decorator to trace additional application logic
- Use Langfuse Prompt Management and link prompts to traces
- Add scores to traces
Visit the OpenAI SDK cookbook for more examples of passing additional parameters. Find out more about Langfuse Evaluations and Prompt Management in the Langfuse documentation.
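As a concrete example, trace attributes can be attached directly on the wrapped completion call. A sketch, assuming the v3 SDK convention of passing session, user, and tags via reserved metadata keys:

```python
response = client.chat.completions.create(
    model="grok-2-latest",
    messages=[{"role": "user", "content": "What is Langfuse?"}],
    name="Grok-2-Trace",
    metadata={
        # Reserved keys mapped to trace attributes (assumes the Langfuse v3 SDK)
        "langfuse_session_id": "session_abc",
        "langfuse_user_id": "user_123",
        "langfuse_tags": ["grok", "demo"],
    },
)
```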
Step 6: See Traces in Langfuse
After running the example, log in to Langfuse to view the detailed traces, including:
- Request parameters
- Response content
- Token usage and latency metrics
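Traces are sent asynchronously in the background, so short-lived scripts should flush the client before exiting to avoid dropping events. A minimal sketch:

```python
from langfuse import get_client

# Block until all pending events have been delivered to Langfuse
get_client().flush()
```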
Interoperability with the Python SDK
You can use this integration together with the Langfuse Python SDK to add additional attributes to the trace.
The `@observe()` decorator provides a convenient way to automatically wrap your instrumented code and add additional attributes to the trace.
```python
from langfuse import observe, get_client

langfuse = get_client()

@observe()
def my_instrumented_function(input):
    output = my_llm_call(input)  # your own LLM call, e.g. the Grok request above

    langfuse.update_current_trace(
        input=input,
        output=output,
        user_id="user_123",
        session_id="session_abc",
        tags=["agent", "my-trace"],
        metadata={"email": "[email protected]"},
        version="1.0.0",
    )

    return output
```
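The decorator also composes with the Grok client from Step 3: a wrapped OpenAI call is recorded as a child of the decorated function's trace. A sketch reusing the setup from above:

```python
import os

from langfuse import observe
from langfuse.openai import openai

client = openai.OpenAI(
    api_key=os.environ.get("GROK_API_KEY"),
    base_url="https://api.x.ai/v1",
)

@observe()
def answer(question: str) -> str:
    # This call is nested under the trace created by @observe()
    response = client.chat.completions.create(
        model="grok-2-latest",
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

print(answer("What is Langfuse?"))
```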
Learn more about using the decorator in the Python SDK docs.
Next Steps
Once you have instrumented your code, you can manage, evaluate, and debug your application: