
Guaranteed Availability

💡 Implementing this is usually not necessary and adds complexity to your application. Langfuse Prompt Management is highly available thanks to multiple caching layers, and we closely monitor its performance (see the status page). However, if you require 100% availability, you can use the following options.

The Langfuse API has high uptime and prompts are cached locally in the SDKs to prevent network issues from affecting your application.

However, get_prompt()/getPrompt() will throw an exception only if both of the following are true:

  • no locally cached prompt (fresh or stale) is available, e.g. a new application instance fetching the prompt for the first time
  • and the network request fails after retries, e.g. a networking or Langfuse API issue
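The resolution order described above can be sketched as plain Python. This is a hedged illustration of the behavior, not the actual Langfuse SDK internals: serve a fresh cache entry, fall back to a stale one if the network fails, and raise only when no cached copy exists at all.

```python
# Hypothetical sketch of the SDK's prompt resolution order (not actual
# Langfuse internals). `cache` maps prompt names to {"value", "stale"}
# entries; `fetch` stands in for the network request (after retries).
def resolve_prompt(cache, fetch, name):
    entry = cache.get(name)
    if entry is not None and not entry["stale"]:
        return entry["value"]  # fresh cache hit, no network needed
    try:
        value = fetch(name)  # network fetch
    except Exception:
        if entry is not None:
            return entry["value"]  # stale cache beats failing outright
        raise  # first-ever fetch and the network is down: exception
    cache[name] = {"value": value, "stale": False}
    return value
```

Only the final branch, no cache and a failed fetch, surfaces as an exception to your application, which is what Options 1 and 2 below guard against.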

To guarantee 100% availability, there are two options:

  1. Pre-fetch prompts on application startup and exit the application if the prompt is not available.
  2. Provide a fallback prompt that will be used in these cases.

Option 1: Pre-fetch prompts

Pre-fetch prompts on application startup and exit the application if the prompt is not available.

import sys

from flask import Flask, jsonify
from langfuse import Langfuse
 
# Initialize the Flask app and Langfuse client
app = Flask(__name__)
langfuse = Langfuse()
 
def fetch_prompts_on_startup():
    try:
        # Fetch and cache the production version of the prompt
        langfuse.get_prompt("movie-critic")
    except Exception as e:
        print(f"Failed to fetch prompt on startup: {e}")
        sys.exit(1)  # Exit the application if the prompt is not available
 
# Call the function during application startup
fetch_prompts_on_startup()
 
@app.route('/get-movie-prompt/<movie>', methods=['GET'])
def get_movie_prompt(movie):
    prompt = langfuse.get_prompt("movie-critic")
    compiled_prompt = prompt.compile(criticlevel="expert", movie=movie)
    return jsonify({"prompt": compiled_prompt})
 
if __name__ == '__main__':
    app.run(debug=True)
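If your application depends on several prompts, the same pre-fetch idea generalizes to a small startup loop. This is a hedged sketch, not part of the Langfuse SDK: `fetch` stands in for `langfuse.get_prompt`, and the retry count and delay are illustrative defaults.

```python
import time

# Hypothetical startup helper: try to pre-fetch every prompt name, with a
# simple retry loop, and abort startup if any prompt is unavailable.
def prefetch_or_exit(fetch, names, retries=3, delay=1.0):
    for name in names:
        for attempt in range(retries):
            try:
                fetch(name)  # populates the SDK's local cache
                break
            except Exception as e:
                if attempt == retries - 1:
                    raise SystemExit(f"Could not pre-fetch '{name}': {e}")
                time.sleep(delay)
```

Called with `prefetch_or_exit(langfuse.get_prompt, ["movie-critic", ...])` before `app.run(...)`, this ensures every route that relies on a prompt only starts serving once the cache is warm.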

Option 2: Fallback

Provide a fallback prompt that will be used in these cases:

from langfuse import Langfuse
langfuse = Langfuse()
 
# Get `text` prompt with fallback
prompt = langfuse.get_prompt(
  "movie-critic",
  fallback="Do you like {{movie}}?"
)
 
# Get `chat` prompt with fallback
chat_prompt = langfuse.get_prompt(
  "movie-critic-chat",
  type="chat",
  fallback=[{"role": "system", "content": "You are an expert on {{movie}}"}]
)
 
# True if the prompt is a fallback
prompt.is_fallback
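Since a served fallback means the managed prompt could not be fetched, it is worth surfacing in your monitoring rather than failing silently. A minimal sketch of that pattern, assuming the `is_fallback` attribute and `compile()` method shown above (the wrapper function and logger name are illustrative, not part of the SDK):

```python
import logging

logger = logging.getLogger("prompts")

# Hypothetical helper: compile a prompt as usual, but emit a warning
# whenever the fallback (rather than the managed version) was served.
def compile_with_warning(prompt, **variables):
    if prompt.is_fallback:
        logger.warning("Serving fallback prompt; Langfuse fetch failed")
    return prompt.compile(**variables)
```

This keeps request handling identical in both cases while making degraded availability visible in your logs.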