Observability for CometAPI with Langfuse
This guide shows you how to integrate CometAPI with Langfuse. CometAPI's API endpoints for chat, language, and code models are fully compatible with OpenAI's API, so you can use the Langfuse OpenAI drop-in replacement to trace all parts of your application.
What is CometAPI? CometAPI is a unified AI model API platform providing access to 500+ AI models through a single OpenAI-compatible interface. Whether you need chat models, embeddings, or specialized AI capabilities, CometAPI offers affordable and reliable access with simple integration.
What is Langfuse? Langfuse is an open source LLM engineering platform that helps teams trace API calls, monitor performance, and debug issues in their AI applications.
Step 1: Install Dependencies
Make sure you have installed the necessary Python packages:
%pip install openai langfuse
Step 2: Set Up Environment Variables
import os
# Get keys for your project from the project settings page
# https://cloud.langfuse.com
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com" # 🇪🇺 EU region
# os.environ["LANGFUSE_HOST"] = "https://us.cloud.langfuse.com" # 🇺🇸 US region
# Get your CometAPI API key from https://api.cometapi.com/console/token
os.environ["COMETAPI_KEY"] = "..."
Step 3: Langfuse OpenAI drop-in Replacement
In this step we use the native OpenAI drop-in replacement by importing from langfuse.openai import openai.

To start using CometAPI with OpenAI's client libraries, pass your CometAPI API key to the api_key option and change the base_url to https://api.cometapi.com/v1/:
# instead of import openai:
from langfuse.openai import openai

client = openai.OpenAI(
    api_key=os.environ.get("COMETAPI_KEY"),
    base_url="https://api.cometapi.com/v1/",
)
Note: The OpenAI drop-in replacement is fully compatible with the low-level Langfuse Python SDK and the @observe() decorator, so you can trace all parts of your application. A short decorator example follows below.
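For instance, you can wrap your own functions with @observe() so the automatically traced OpenAI call is nested inside a parent trace. A minimal sketch, assuming a recent Langfuse Python SDK that exports observe from the top-level package; the function name and prompt are illustrative:

import os

from langfuse import observe
from langfuse.openai import openai

client = openai.OpenAI(
    api_key=os.environ.get("COMETAPI_KEY"),
    base_url="https://api.cometapi.com/v1/",
)

@observe()  # creates a parent trace spanning everything called inside
def summarize(text: str) -> str:
    # This completion is recorded as a generation nested under the trace
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Summarize in one sentence: {text}"}],
    )
    return response.choices[0].message.content

print(summarize("Langfuse traces every CometAPI call made through the wrapped client."))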
Step 4: Run an Example
The following cell demonstrates how to call CometAPI’s chat model using the traced OpenAI client. All API calls will be automatically traced by Langfuse.
client = openai.OpenAI(
    api_key=os.environ.get("COMETAPI_KEY"),
    base_url="https://api.cometapi.com/v1/",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "Act like you are a helpful assistant."},
        {"role": "user", "content": "What are the famous attractions in San Francisco?"},
    ],
)
print(response.choices[0].message.content)
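In short-lived environments such as scripts and notebooks, flush the Langfuse client before the process exits so buffered events are not lost. A small sketch, assuming the SDK's get_client helper:

from langfuse import get_client

# Blocks until all queued trace events have been sent to Langfuse
get_client().flush()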
Step 5: See Traces in Langfuse
After running the example model call, you can see the traces in Langfuse. You will see detailed information about your CometAPI API calls, including:
- Request parameters (model, messages, temperature, etc.)
- Response content
- Token usage statistics
- Latency metrics
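To make traces easier to find and filter, the Langfuse OpenAI wrapper also accepts Langfuse-specific parameters on the create call, such as a trace name, tags, and metadata. A sketch with illustrative values:

response = client.chat.completions.create(
    name="cometapi-sf-attractions",   # trace name shown in the Langfuse UI
    metadata={"source": "cookbook"},  # arbitrary key-value context attached to the trace
    tags=["cometapi", "demo"],        # tags for filtering traces in the UI
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What are the famous attractions in San Francisco?"}],
)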