Observability for xAI / Grok with Langfuse

This guide shows you how to integrate Grok with Langfuse using the OpenAI SDK.

What is Grok? Grok is xAI's family of large language models, served through an API that is compatible with the OpenAI SDK. Learn more in the Grok Documentation.

What is Langfuse? Langfuse is an open source LLM engineering platform that helps teams trace API calls, monitor performance, and debug issues in their AI applications.

Step 1: Install Dependencies

Make sure you have installed the necessary Python packages:

%pip install openai langfuse

Step 2: Set Up Environment Variables

import os
 
# Get keys for your project from the project settings page
# https://cloud.langfuse.com
 
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..." # DOCS EXAMPLE KEYS
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..." # DOCS EXAMPLE KEYS
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com" # 🇪🇺 EU region
# os.environ["LANGFUSE_HOST"] = "https://us.cloud.langfuse.com" # 🇺🇸 US region
 
 
# Get your Grok API key from your Grok account settings
os.environ["GROK_API_KEY"] = "xai-..."

Step 3: Use Grok with the OpenAI SDK

To use Grok through the OpenAI SDK, import the Langfuse drop-in replacement for OpenAI and point the client at Grok's endpoint by replacing the base URL.

# Instead of importing openai directly:
from langfuse.openai import openai
 
client = openai.OpenAI(
  api_key=os.environ.get("GROK_API_KEY"),
  base_url="https://api.x.ai/v1"  # Grok's endpoint
)

Step 4: Run an Example

The following example demonstrates how to make a simple request using Grok’s API. All API calls will be automatically traced by Langfuse.

response = client.chat.completions.create(
  model="grok-2-latest",
  messages=[
    {"role": "system", "content": "You are an assistant."},
    {"role": "user", "content": "What is Langfuse?"}
  ],
  name="Grok-2-Trace"  # Langfuse-specific kwarg: sets the trace name in the Langfuse UI
)
 
print(response.choices[0].message.content)
Langfuse is an observability and debugging tool specifically designed for Large Language Model (LLM) applications. It helps developers and engineers monitor, debug, and improve their LLM-powered applications by providing insights into the performance and behavior of the models. Key features of Langfuse include:

- **Tracing**: Allows you to track the flow of requests through your application, helping you understand how different components interact.
- **Metrics**: Provides quantitative data on the performance of your LLM, such as latency, throughput, and error rates.
- **Logs**: Captures detailed logs of interactions with the LLM, which can be invaluable for debugging and understanding model behavior.
- **Analytics**: Offers analytics to help you optimize your application based on real usage data.

Langfuse is particularly useful for teams working on complex AI-driven applications, as it helps in identifying issues, optimizing performance, and ensuring the reliability of the system.
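
Streaming requests are traced as well. A short sketch, assuming the Langfuse drop-in wrapper captures streamed completions the same way it captures regular ones (the `name` kwarg is again the Langfuse trace name):

stream = client.chat.completions.create(
  model="grok-2-latest",
  messages=[
    {"role": "user", "content": "Write a haiku about observability."}
  ],
  stream=True,
  name="Grok-2-Streaming-Trace"  # Langfuse trace name
)

# Print tokens as they arrive; the wrapper records the assembled completion
for chunk in stream:
  if chunk.choices and chunk.choices[0].delta.content:
    print(chunk.choices[0].delta.content, end="")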

Step 5: Enhance Tracing (Optional)

You can enhance your Grok traces by passing additional Langfuse parameters on the completion call, for example a trace name, metadata, tags, a user ID, or a session ID; see the sketch below.

Visit the OpenAI SDK cookbook for more examples of passing additional parameters, and find out more about Langfuse Evaluations and Prompt Management in the Langfuse documentation.
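
As a sketch of what this looks like in code, the Langfuse OpenAI integration accepts extra keyword arguments on the completion call. The parameter names below (`metadata`, `session_id`, `user_id`, `tags`) match the v2 Python SDK; newer SDK versions may expect them in a different form, so check the Langfuse docs for your version:

response = client.chat.completions.create(
  model="grok-2-latest",
  messages=[
    {"role": "user", "content": "Summarize Langfuse in one sentence."}
  ],
  name="Grok-2-Trace",                   # trace name in the Langfuse UI
  metadata={"source": "docs-example"},   # arbitrary key-value metadata
  session_id="session-123",              # groups related traces into a session
  user_id="user-456",                    # attributes the trace to a user
  tags=["grok", "demo"]                  # filterable tags
)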

Step 6: See Traces in Langfuse

After running the example, log in to Langfuse to view the detailed traces, including:

  • Request parameters
  • Response content
  • Token usage and latency metrics
(Screenshot: Langfuse trace example)

Public example trace link in Langfuse
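
Note that the Langfuse SDK sends events asynchronously in batches. In a notebook this usually just works, but in a short-lived script events can still be queued when the process exits; flushing before exit ensures the traces above actually reach Langfuse. A minimal sketch, assuming the v2 helper `openai.flush_langfuse()` (newer SDK versions expose an equivalent `flush()` on the Langfuse client):

# Send any queued trace events before the process exits
openai.flush_langfuse()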
