Amazon Bedrock AgentCore Integration
Trace AI agents built with Amazon Bedrock AgentCore via OpenTelemetry and Langfuse
Day 2 of Launch Week 4 brings a new integration with Amazon Bedrock AgentCore, enabling comprehensive observability for AI agents deployed on AWS infrastructure.
Amazon Bedrock AgentCore is a managed service that enables you to build, deploy, and manage AI agents in production. With our new integration guide, you can now trace your AgentCore agents using OpenTelemetry and get full visibility into agent executions, LLM calls, tool usage, and MCP interactions.
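As a rough illustration of what the OpenTelemetry side of this looks like, here is a minimal sketch that exports spans to Langfuse's OTLP endpoint and wraps an agent invocation in a span. The endpoint URL, environment variable names, and span/attribute names are assumptions for illustration; follow the integration guide for the exact setup.

```python
import base64
import os

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Langfuse's OTLP endpoint authenticates with Basic auth built from the
# project's public and secret keys (read here from assumed env var names).
LANGFUSE_AUTH = base64.b64encode(
    f"{os.environ['LANGFUSE_PUBLIC_KEY']}:{os.environ['LANGFUSE_SECRET_KEY']}".encode()
).decode()

exporter = OTLPSpanExporter(
    # Assumed Langfuse Cloud endpoint; self-hosted deployments use their own host.
    endpoint="https://cloud.langfuse.com/api/public/otel/v1/traces",
    headers={"Authorization": f"Basic {LANGFUSE_AUTH}"},
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("agentcore-agent")

# Wrap an agent invocation in a span so the call shows up as a trace in Langfuse.
with tracer.start_as_current_span("invoke-agent") as span:
    span.set_attribute("agent.runtime", "bedrock-agentcore")
    # ... invoke your AgentCore agent here ...
```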
What this enables
The integration supports distributed tracing, allowing you to connect traces from your local development environment to production AgentCore deployments. With it, you can:
- Monitor complete agent execution flows in production
- Track LLM calls with token counts and costs
- Debug tool usage and MCP interactions
- Analyze latency metrics at each step
- Maintain trace continuity across distributed systems
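Trace continuity across distributed systems typically comes down to propagating the OpenTelemetry context across the process boundary. The sketch below shows the general pattern of injecting the current trace context into the request sent to a deployed agent; the `otel_context` payload field and the agent-side extraction are assumptions, not necessarily how the integration guide wires it up.

```python
import json

from opentelemetry import trace
from opentelemetry.propagate import inject

tracer = trace.get_tracer("local-client")

# Start a local span, then inject its context (a W3C traceparent) into the
# request we send to the deployed agent so both ends end up in one trace.
with tracer.start_as_current_span("call-production-agent"):
    carrier: dict[str, str] = {}
    inject(carrier)  # fills in e.g. {"traceparent": "00-<trace_id>-<span_id>-01"}

    payload = {
        "prompt": "Summarize today's support tickets",
        # Hypothetical field: the deployed agent is assumed to read this and
        # restore the context with opentelemetry.propagate.extract().
        "otel_context": carrier,
    }
    # ... send `payload` to the AgentCore runtime (HTTP call, SDK invoke, etc.) ...
    print(json.dumps(payload))
```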
We’ve also included an example repository from @aristsakpinis93 that demonstrates a complete continuous evaluation loop with AgentCore and Langfuse, covering experimentation, QA testing, and production monitoring.