This is a Jupyter notebook

Observability for xAI / Grok with Langfuse

This guide shows you how to integrate Grok with Langfuse using the OpenAI SDK.

What is Grok? Grok is xAI's family of large language models, served through an OpenAI-compatible API. Learn more in the Grok Documentation.

What is Langfuse? Langfuse is an open source LLM engineering platform that helps teams trace API calls, monitor performance, and debug issues in their AI applications.

Step 1: Install Dependencies

Make sure you have installed the necessary Python packages:

%pip install openai langfuse

Step 2: Set Up Environment Variables

import os
 
# Get keys for your project from the project settings page
# https://cloud.langfuse.com
 
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_BASE_URL"] = "https://cloud.langfuse.com" # 🇪🇺 EU region
# os.environ["LANGFUSE_BASE_URL"] = "https://us.cloud.langfuse.com" # 🇺🇸 US region
 
 
# Get your Grok API key from your Grok account settings
os.environ["GROK_API_KEY"] = "xai-..."

Step 3: Use Grok with the OpenAI SDK

To use Grok through the OpenAI SDK, import the Langfuse drop-in replacement for the OpenAI client and point the base URL at Grok's endpoint.

# Instead of importing openai directly:
from langfuse.openai import openai
 
client = openai.OpenAI(
  api_key=os.environ.get("GROK_API_KEY"),
  base_url="https://api.x.ai/v1"  # Grok's endpoint
)

Step 4: Run an Example

The following example demonstrates how to make a simple request using Grok’s API. All API calls will be automatically traced by Langfuse.

response = client.chat.completions.create(
  model="grok-2-latest",
  messages=[
    {"role": "system", "content": "You are an assistant."},
    {"role": "user", "content": "What is Langfuse?"}
  ],
  name = "Grok-2-Trace"
)
 
print(response.choices[0].message.content)
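Because Grok's endpoint is OpenAI-compatible, streaming works the same way as with any OpenAI client. The sketch below assumes the `client` from Step 3; the `collect_stream` helper is hypothetical and only concatenates text deltas from an OpenAI-style chunk stream:

```python
def collect_stream(chunks):
    """Concatenate the text deltas of an OpenAI-style chat completion stream."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:  # deltas can be None (e.g. for the final chunk)
            parts.append(delta)
    return "".join(parts)

# Usage (requires valid API keys and the client from Step 3):
# stream = client.chat.completions.create(
#     model="grok-2-latest",
#     messages=[{"role": "user", "content": "What is Langfuse?"}],
#     stream=True,
# )
# print(collect_stream(stream))
```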

Step 5: Enhance Tracing (Optional)

You can enrich your Grok traces with additional attributes:

Visit the OpenAI SDK cookbook for more examples of passing additional parameters, and see the Langfuse documentation to learn more about Evaluations and Prompt Management.
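As one example, trace attributes such as a session or user ID can be passed on the wrapped `create()` call via `metadata`. The helper below is a hypothetical sketch; the `langfuse_*`-prefixed metadata keys follow the convention documented for the Langfuse OpenAI integration, so verify them against your SDK version:

```python
# Hypothetical helper: build Langfuse-specific keyword arguments for a
# wrapped create() call.
def trace_kwargs(name, session_id=None, user_id=None, tags=None):
    """Return extra kwargs (name + metadata) for the Langfuse OpenAI wrapper."""
    metadata = {}
    if session_id:
        metadata["langfuse_session_id"] = session_id
    if user_id:
        metadata["langfuse_user_id"] = user_id
    if tags:
        metadata["langfuse_tags"] = tags
    return {"name": name, "metadata": metadata}

# Usage (requires valid API keys and the client from Step 3):
# response = client.chat.completions.create(
#     model="grok-2-latest",
#     messages=[{"role": "user", "content": "What is Langfuse?"}],
#     **trace_kwargs("Grok-2-Trace", session_id="session_abc", user_id="user_123"),
# )
```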

Step 6: See Traces in Langfuse

After running the example, log in to Langfuse to view the detailed traces, including:

  • Request parameters
  • Response content
  • Token usage and latency metrics

Example trace in the Langfuse UI (a public example trace is available in Langfuse).

Interoperability with the Python SDK

You can use this integration together with the Langfuse SDKs to add additional attributes to the trace.

The @observe() decorator provides a convenient way to automatically wrap your instrumented code and add additional attributes to the trace.

from langfuse import observe, propagate_attributes, get_client
 
langfuse = get_client()
 
@observe()
def my_llm_pipeline(input):
    # Add additional attributes (user_id, session_id, metadata, version, tags) to all spans created within this execution scope
    with propagate_attributes(
        user_id="user_123",
        session_id="session_abc",
        tags=["agent", "my-trace"],
        metadata={"email": "user@langfuse.com"},
        version="1.0.0"
    ):
 
        # YOUR APPLICATION CODE HERE
        result = call_llm(input)
 
        # Update the trace input and output
        langfuse.update_current_trace(
            input=input,
            output=result,
        )
 
        return result

Learn more about using the Decorator in the Langfuse SDK instrumentation docs.

Troubleshooting

No traces appearing

First, enable debug mode in the Python SDK:

export LANGFUSE_DEBUG="True"
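If you configure your application from Python rather than the shell, the same flag can be set in code. Note it must be set before the Langfuse SDK is imported so it is picked up at import time:

```python
import os

# Equivalent to the shell export above; set this *before* importing langfuse.
os.environ["LANGFUSE_DEBUG"] = "True"
```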

Then run your application and check the debug logs:

  • OTel spans appear in the logs: Your application is instrumented correctly but traces are not reaching Langfuse. To resolve this:
    1. Call langfuse.flush() at the end of your application to ensure all traces are exported.
    2. Verify that you are using the correct API keys and base URL.
  • No OTel spans in the logs: Your application is not instrumented correctly. Make sure the instrumentation runs before your application code.

Unwanted observations in Langfuse

The Langfuse SDK is based on OpenTelemetry. Other libraries in your application may emit OTel spans that are not relevant to you. These still count toward your billable units, so you should filter them out. See Unwanted spans in Langfuse for details.

Missing attributes

Some attributes may be stored in the metadata object of the observation rather than being mapped to the Langfuse data model. If a mapping or integration does not work as expected, please raise an issue on GitHub.

Next Steps

Once you have instrumented your code, you can manage, evaluate, and debug your application in Langfuse.
