OpenLLMetry Integration via OpenTelemetry

This guide, also available as a Jupyter notebook, shows how to send traces from OpenLLMetry (the Traceloop SDK) to Langfuse via its OpenTelemetry (OTLP) endpoint.

%pip install openai traceloop-sdk
import os
import base64
 
# Langfuse project API keys (from your Langfuse project settings)
LANGFUSE_PUBLIC_KEY=""
LANGFUSE_SECRET_KEY=""
# The OTLP endpoint authenticates via HTTP Basic auth built from the key pair
LANGFUSE_AUTH=base64.b64encode(f"{LANGFUSE_PUBLIC_KEY}:{LANGFUSE_SECRET_KEY}".encode()).decode()

# Point the Traceloop exporter at Langfuse's OpenTelemetry endpoint
os.environ["TRACELOOP_BASE_URL"] = "https://cloud.langfuse.com/api/public/otel" # EU data region
# os.environ["TRACELOOP_BASE_URL"] = "https://us.cloud.langfuse.com/api/public/otel" # US data region
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = f"Authorization=Basic {LANGFUSE_AUTH}"

# Your OpenAI API key
os.environ["OPENAI_API_KEY"] = ""

OpenAI SDK

Initialize the Traceloop SDK before making OpenAI calls; OpenLLMetry then instruments the OpenAI client automatically and exports the resulting spans to Langfuse.

from openai import OpenAI
from traceloop.sdk import Traceloop
 
# Initialize OpenLLMetry; disable_batch=True exports spans immediately instead of batching them, which is useful in short-lived notebook sessions
Traceloop.init(disable_batch=True)
 
openai_client = OpenAI()
 
chat_completion = openai_client.chat.completions.create(
    messages=[
        {
          "role": "user",
          "content": "What is LLM Observability?",
        }
    ],
    model="gpt-4o-mini",
)
 
print(chat_completion)
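
Beyond tracing single calls, the Traceloop SDK also provides decorators to group related LLM calls into one trace. The following is a minimal sketch assuming the workflow and task decorators from traceloop.sdk.decorators and the environment setup above; the function and workflow names are illustrative.

from openai import OpenAI
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import task, workflow

Traceloop.init(disable_batch=True)

openai_client = OpenAI()

@task(name="summarize")
def summarize(text: str) -> str:
    # The OpenAI call is instrumented automatically and nested under the task span
    completion = openai_client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Summarize in one sentence: {text}"}],
    )
    return completion.choices[0].message.content

@workflow(name="summarization_workflow")
def run(text: str) -> str:
    # The workflow span becomes the parent of the nested task and OpenAI spans
    return summarize(text)

print(run("LLM observability captures prompts, completions, latency, and cost."))

In Langfuse, the workflow span should appear as the root of the trace, with the task and OpenAI spans nested underneath it.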

Example Trace

(Screenshot: OpenLLMetry OpenAI trace in Langfuse)
