Variables in Prompts
Variables are placeholders for dynamic strings in your prompts. They allow you to create flexible prompt templates that can be customized at runtime without changing the prompt definition itself.
All prompts support variables using the {{variable}} syntax. When you fetch a prompt from Langfuse and compile it, you provide values for these variables that get inserted into the prompt template.
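For example, a prompt template with two variables and its compiled result (using the movie-critic example from the Get started section below) look like this:

```
Template stored in Langfuse:
  As an {{criticLevel}} movie critic, do you like {{movie}}?

Compiled at runtime with criticLevel="expert" and movie="Dune 2":
  As an expert movie critic, do you like Dune 2?
```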
Get started
Create prompt with variables
When creating a prompt in the Langfuse UI, use double curly braces ({{variable_name}}) to define a variable anywhere in your prompt text.

Variables work in both text prompts and chat prompts. You can use them in any message content.
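You can also define prompts with variables programmatically. Below is a minimal sketch using the Python SDK's create_prompt method; the prompt name, messages, and label are illustrative:

```python
from langfuse import get_client

langfuse = get_client()

# Create a chat prompt; variables can appear in any message's content
langfuse.create_prompt(
    name="movie-critic-chat",
    type="chat",
    prompt=[
        {"role": "system", "content": "You are an {{criticLevel}} movie critic."},
        {"role": "user", "content": "Do you like {{movie}}?"},
    ],
    labels=["production"],  # deploy this version to the "production" label
)
```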
Compile variables at runtime
In your application, use the .compile() method to replace variables with actual values. Pass the variables as keyword arguments (Python) or an object (JavaScript/TypeScript).
```python
from langfuse import get_client
import openai

langfuse = get_client()

# Get the prompt
prompt = langfuse.get_prompt("movie-critic")

# Compile with variable values
compiled_prompt = prompt.compile(
    criticLevel="expert",
    movie="Dune 2"
)
# -> compiled_prompt = "As an expert movie critic, do you like Dune 2?"

# Use with your LLM
response = openai.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": compiled_prompt}]
)
```
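For chat prompts, compile() works the same way but returns the list of chat messages with the variable values substituted into each message's content. A minimal sketch, assuming the chat prompt created in the previous step:

```python
# Fetch the chat prompt (type="chat" returns a chat prompt client)
chat_prompt = langfuse.get_prompt("movie-critic-chat", type="chat")

# Returns a list of {"role": ..., "content": ...} messages with variables filled in
messages = chat_prompt.compile(criticLevel="expert", movie="Dune 2")

response = openai.chat.completions.create(
    model="gpt-4",
    messages=messages
)
```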
Not exactly what you need? Consider these similar features:
- Prompt references for reusing sub-prompts
- Message placeholders for inserting arrays of complete messages instead of strings
Or related FAQ pages:
- Can I dynamically select sub-prompts at runtime?
- Using external templating libraries (Jinja, Liquid, etc.) with Langfuse prompts