
Khan Academy uses Langfuse’s LLM Engineering platform to build Khanmigo AI
How Khan Academy uses Langfuse to debug and improve Khanmigo AI, a student tutor and teaching assistant.
About Khan Academy
Khan Academy is the beloved personalized learning system with over 150 million registered learners across 190 countries in 56 languages. It provides free online courses, lessons, and practice exercises covering subjects from elementary math to college-level topics like calculus, physics, economics, and computer science. Khan Academy also provides tools for teachers and parents to track student progress. Khan Academy’s mission: “A free, world-class education for anyone, anywhere.”
Pioneering AI in Education
Khan Academy started their journey working with large language models in 2022 as an early access partner for OpenAI’s GPT-4. They have learned many hard-won lessons and shipped dozens of cornerstone AI experiences, led by their flagship AI product: Khanmigo.
Challenge: Go-Based Infrastructure Meets AI
Running a nonprofit at this scale demands extreme resource efficiency.
Khan Academy began migrating from Python 2 to Go in 2019. When it came time to implement AI observability, they faced a unique challenge: most tracing tools required specific SDKs or wrappers that didn’t fit their Go-based architecture.
“What we really loved about Langfuse was the open API,” says Walt from Khan Academy’s engineering team. “While many tracing tools require specific SDKs or wrappers, Langfuse’s open API enabled us to build our own Golang client around it. This was huge for our Go-based services.”
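The pattern behind such a client is simple: authenticate against Langfuse’s public REST API and post trace events as JSON from any language. The sketch below is illustrative only; the /api/public/ingestion endpoint, basic auth with project keys, and the batched event shape are taken from Langfuse’s public API documentation, not from Khan Academy’s internal client.

```go
// Minimal sketch of a Go client posting a trace to the Langfuse ingestion API.
// Assumption: endpoint, auth scheme, and payload shape follow Langfuse's
// public API docs — this is not Khan Academy's internal implementation.
package main

import (
	"bytes"
	"crypto/rand"
	"encoding/hex"
	"encoding/json"
	"fmt"
	"net/http"
	"time"
)

// event is a single entry in the batch accepted by the ingestion endpoint.
type event struct {
	ID        string         `json:"id"`
	Type      string         `json:"type"`
	Timestamp string         `json:"timestamp"`
	Body      map[string]any `json:"body"`
}

// newID returns a random hex identifier for events and traces.
func newID() string {
	b := make([]byte, 16)
	_, _ = rand.Read(b)
	return hex.EncodeToString(b)
}

// sendTrace posts a single trace-create event to the given Langfuse host.
func sendTrace(host, publicKey, secretKey, name string, metadata map[string]any) error {
	payload, err := json.Marshal(map[string]any{
		"batch": []event{{
			ID:        newID(),
			Type:      "trace-create",
			Timestamp: time.Now().UTC().Format(time.RFC3339Nano),
			Body: map[string]any{
				"id":       newID(),
				"name":     name,
				"metadata": metadata,
			},
		}},
	})
	if err != nil {
		return err
	}

	req, err := http.NewRequest(http.MethodPost, host+"/api/public/ingestion", bytes.NewReader(payload))
	if err != nil {
		return err
	}
	// Langfuse authenticates API calls with HTTP basic auth:
	// public key as the username, secret key as the password.
	req.SetBasicAuth(publicKey, secretKey)
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()

	if resp.StatusCode >= 300 {
		return fmt.Errorf("langfuse ingestion failed: %s", resp.Status)
	}
	return nil
}

func main() {
	// Placeholder keys and trace name for illustration only.
	if err := sendTrace(
		"https://cloud.langfuse.com",
		"pk-lf-...",
		"sk-lf-...",
		"khanmigo-tutor-session",
		map[string]any{"feature": "tutoring"},
	); err != nil {
		fmt.Println("error:", err)
	}
}
```

Because the ingestion API is plain HTTP plus JSON, a thin client like this fits naturally into an existing Go service without pulling in a vendor SDK.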
LLM Observability Across Product Teams
Since Khan Academy deployed Langfuse in April 2024 as one of Langfuse’s first enterprise customers, adoption has spread to over 100 users across 7 product and 4 infrastructure teams. The platform provides deep observability not just for one AI feature, but for dozens of AI interactions across different products.
"Langfuse has really enabled our developers to get extremely fast feedback. When building and deploying features, we can quickly watch how those experiences are going. Langfuse is fundamental to how our developers understand their AI implementations.

The integration runs deep:
- Internal UIs link directly to Langfuse traces for user experience analysis (see the sketch after this list)
- Community support teams access traces for incident investigation
- Engineers share Langfuse URLs for collaborative debugging
- Senior leadership uses the platform for product decision analysis
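The deep-linking in the first point typically amounts to building a URL from a trace ID. A hypothetical Go helper is shown below; the host, project ID, and URL pattern are assumptions based on Langfuse Cloud’s trace pages, not Khan Academy’s code.

```go
// traceURL builds a deep link into the Langfuse UI for a given trace, so
// internal tools and incident reports can link straight to it.
// Hypothetical helper: host, projectID, and the URL pattern are assumptions.
func traceURL(host, projectID, traceID string) string {
	return fmt.Sprintf("%s/project/%s/traces/%s", host, projectID, traceID)
}
```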
Results: Speed Without Compromise
For Khan Academy, Langfuse solved multiple challenges simultaneously. The organization avoided the engineering overhead of hosting its own tracing infrastructure while retaining the flexibility to work within its Go ecosystem.
Stack Flexibility
Langfuse plays well with Khan Academy's Go stack. They've built a custom Go client around the Langfuse API.
Development Speed
Deep observability is the foundation for rapid iteration and debugging capabilities across products.
Hosting Simplicity
The nonprofit avoided the engineering overhead of hosting its own tracing infrastructure.
Platform across Teams
Development teams across the organization now share a common observability platform, preventing tool fragmentation.
"In early 2024, Langfuse was a big bet for us, and we've been overwhelmingly happy with that bet. Everything about choosing Langfuse has paid off in spades.
