
Observability for Fireworks AI with Langfuse

This guide shows you how to integrate Fireworks AI with Langfuse. Because Fireworks AI's API endpoints are fully OpenAI-compatible, you can use the Langfuse OpenAI drop-in replacement to trace and monitor your AI applications without further code changes.

What is Fireworks AI? Fireworks AI is a platform that provides API access to state-of-the-art open-source and proprietary AI models with OpenAI-compatible endpoints.

What is Langfuse? Langfuse is an open source LLM engineering platform that helps teams trace API calls, monitor performance, and debug issues in their AI applications.

Step 1: Install Dependencies

%pip install openai langfuse

Step 2: Set Up Environment Variables

import os
 
# Get keys for your project from the project settings page
# https://cloud.langfuse.com
 
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..." 
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com" # 🇪🇺 EU region
# os.environ["LANGFUSE_HOST"] = "https://us.cloud.langfuse.com" # 🇺🇸 US region
 
# Set your Fireworks API details
os.environ["FIREWORKS_AI_API_BASE"] = "https://api.fireworks.ai/inference/v1"
os.environ["FIREWORKS_AI_API_KEY"] = "fw_..."

Step 3: Use Langfuse OpenAI Drop-in Replacement

from langfuse.openai import openai
 
client = openai.OpenAI(
  api_key=os.environ.get("FIREWORKS_AI_API_KEY"),
  base_url=os.environ.get("FIREWORKS_AI_API_BASE")
)

Step 4: Run an Example

response = client.chat.completions.create(
  model="accounts/fireworks/models/llama-v3p1-8b-instruct",
  messages=[
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Why is open source important?"},
  ],
  name = "Fireworks-AI-Trace" # name of the trace
)
print(response.choices[0].message.content)
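
Streaming works through the same drop-in replacement; Langfuse records the full completion once the stream finishes. A minimal sketch, with an assumed trace name:

stream = client.chat.completions.create(
  model="accounts/fireworks/models/llama-v3p1-8b-instruct",
  messages=[{"role": "user", "content": "Why is open source important?"}],
  stream=True,
  name="Fireworks-AI-Streaming-Trace"  # assumed trace name for this example
)
for chunk in stream:
  # Some chunks (e.g. the final one) carry no content delta, so guard the access
  if chunk.choices and chunk.choices[0].delta.content:
    print(chunk.choices[0].delta.content, end="")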

Step 5: See Traces in Langfuse

After running the example, log in to Langfuse to view the detailed traces, including:

  • Request parameters
  • Response content
  • Token usage and latency metrics
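
If you run these examples as a short-lived script rather than a notebook, events may still be queued when the process exits. A small sketch using the SDK's flush() method ensures all traces are sent before shutdown:

from langfuse import get_client
 
# Block until all queued events have been delivered to Langfuse
get_client().flush()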

Example trace in the Langfuse UI (screenshot). A public example trace is available in Langfuse.

Interoperability with the Python SDK

You can use this integration together with the Langfuse Python SDK to add additional attributes to the trace.

The @observe() decorator automatically wraps your instrumented code in a trace, and get_client() returns the Langfuse client so you can attach attributes such as user_id, session_id, tags, and metadata to the current trace.

from langfuse import observe, get_client
 
langfuse = get_client()
 
@observe()
def my_instrumented_function(input):
    # Reuses the Fireworks `client` from Step 3; any LLM call works here
    output = client.chat.completions.create(
        model="accounts/fireworks/models/llama-v3p1-8b-instruct",
        messages=[{"role": "user", "content": input}],
    ).choices[0].message.content
 
    # Attach additional attributes to the current trace
    langfuse.update_current_trace(
        input=input,
        output=output,
        user_id="user_123",
        session_id="session_abc",
        tags=["agent", "my-trace"],
        metadata={"email": "user@example.com"},  # placeholder email
        version="1.0.0"
    )
 
    return output
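
Calling the decorated function then produces a single trace containing both your custom attributes and the nested Fireworks generation:

print(my_instrumented_function("Why is open source important?"))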

Learn more about using the @observe() decorator in the Langfuse Python SDK docs.

Next Steps

Once you have instrumented your code, you can manage, evaluate, and debug your application in the Langfuse UI.
