Public API
Langfuse is open and meant to be extended via custom workflows and integrations. All Langfuse data and features are available via the API.
/api/public
References:
- API Reference: https://api.reference.langfuse.com
- OpenAPI spec: https://cloud.langfuse.com/generated/api/openapi.yml
- Postman collection: https://cloud.langfuse.com/generated/postman/collection.json
There are 3 different groups of APIs:
- This page -> Project-level APIs: CRUD traces/evals/prompts/configuration within a project
- Organization-level APIs: provision projects, users (SCIM), and permissions
- Instance Management API: administer organizations on self-hosted installations
Authentication
Authenticate with the API using Basic Auth. The API keys are available in the Langfuse project settings.
- Username: Langfuse Public Key
- Password: Langfuse Secret Key
Example:
curl -u public-key:secret-key https://cloud.langfuse.com/api/public/projects
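The same credentials work with any HTTP client. As a minimal sketch in Python, assuming the requests library (the placeholder keys and Cloud base URL mirror the curl example above):
import requests

# Basic Auth: public key as username, secret key as password (placeholders).
response = requests.get(
    "https://cloud.langfuse.com/api/public/projects",
    auth=("public-key", "secret-key"),
)
response.raise_for_status()
print(response.json())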
Access via SDKs
Both the Langfuse Python SDK and the JS/TS SDK provide a strongly-typed wrapper around our public REST API for your convenience. The API methods are accessible via the api property on the Langfuse client instance in both SDKs.
You can use your editor’s Intellisense to explore the API methods and their parameters.
When fetching prompts, please use the get_prompt (Python) / getPrompt (JS/TS) methods on the Langfuse client to benefit from client-side caching, automatic retries, and fallbacks.
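For example, a minimal sketch in Python, assuming a text prompt named "movie-critic" with a {{movie}} variable exists in your project (both are hypothetical):
from langfuse import get_client

langfuse = get_client()

# get_prompt adds client-side caching, automatic retries, and optional fallbacks.
prompt = langfuse.get_prompt("movie-critic")  # hypothetical prompt name
print(prompt.compile(movie="Dune 2"))  # fills the {{movie}} template variable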
When using the Python SDK:
from langfuse import get_client
langfuse = get_client()
...
# fetch a trace
langfuse.api.trace.get(trace_id)
# async client via asyncio
await langfuse.async_api.trace.get(trace_id)
# explore more endpoints via Intellisense
langfuse.api.*
await langfuse.async_api.*
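The async_api calls above must run inside an event loop; a minimal sketch, assuming a placeholder trace id:
import asyncio
from langfuse import get_client

async def main():
    langfuse = get_client()
    # "abc123" is a placeholder; use a trace id that exists in your project.
    trace = await langfuse.async_api.trace.get("abc123")
    print(trace.id)

asyncio.run(main())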
Ingest Traces via the API
The OpenTelemetry Endpoint will replace the Ingestion API in the future, so it is strongly recommended to switch to the OpenTelemetry Endpoint for trace ingestion. Please refer to the OpenTelemetry docs for more information; a minimal exporter configuration sketch follows the list below.
- OpenTelemetry Traces Ingestion Endpoint implements the OTLP/HTTP specification for trace ingestion, providing native OpenTelemetry integration for Langfuse Observability.
- (Legacy) Ingestion API allows ingesting traces directly via the REST API.
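A minimal sketch of pointing the OpenTelemetry Python SDK at Langfuse via OTLP/HTTP; the exact traces path and authentication header are assumptions here, so verify them against the OpenTelemetry docs linked above:
import base64
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Basic Auth header built from the project API keys (placeholders).
auth = base64.b64encode(b"public-key:secret-key").decode()

exporter = OTLPSpanExporter(
    # Assumed OTLP/HTTP traces path; check the OpenTelemetry docs for the exact endpoint.
    endpoint="https://cloud.langfuse.com/api/public/otel/v1/traces",
    headers={"Authorization": f"Basic {auth}"},
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("example-app")
with tracer.start_as_current_span("example-span"):
    pass  # spans created here are exported to Langfuse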
Retrieve Data via the API
- Observations API - Retrieve observation data (spans, generations, events) from Langfuse for use in custom workflows, evaluation pipelines, and analytics. The v2 API offers high-performance data retrieval with cursor-based pagination and selective field retrieval; a retrieval sketch follows this list.
- Metrics API - Retrieve aggregated analytics and metrics from your Langfuse data. Query across different views (observations, scores) with customizable dimensions, metrics, filters, and time granularity for powerful custom reports and dashboards.
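For instance, a minimal sketch of pulling recent observations over plain HTTP, assuming the requests library and the GET /api/public/observations endpoint; consult the API reference for the full parameter list and for the v2 and metrics endpoints:
import requests

# Placeholder keys; Basic Auth as described above.
AUTH = ("public-key", "secret-key")

response = requests.get(
    "https://cloud.langfuse.com/api/public/observations",
    auth=AUTH,
    params={"limit": 10},  # page size; see the API reference for all filters
)
response.raise_for_status()
# List endpoints return a paginated envelope with a "data" array.
for observation in response.json()["data"]:
    print(observation["id"], observation["type"])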
Alternatives
You can also export data via:
- UI - Manual batch-exports from the Langfuse UI
- Blob Storage - Scheduled automated exports to cloud storage