Langfuse
Langfuse is an open-source alternative to LangSmith. It provides distributed tracing for LLM apps (OpenAI, Anthropic, LangChain, LlamaIndex, Haystack, DSPy, Vercel AI SDK, and more via OTel), prompt management with versioning and A/B testing, datasets, online and offline evaluations, and dashboards for cost/latency/quality. You can self-host with Docker or use Langfuse Cloud (EU/US).
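To make the tracing model concrete, the sketch below mimics what an `@observe`-style instrumentation layer records per call: a span with the function name, input, output, and latency. This is a conceptual toy, not Langfuse's API or storage format; in real code the `@observe` decorator from the SDK does this for you and ships the spans to the Langfuse backend.

```python
import functools
import time

# In-memory span store for illustration only; Langfuse batches spans
# and sends them to its backend asynchronously.
SPANS = []

def trace(fn):
    """Toy tracing decorator: records name, input, output, latency."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        SPANS.append({
            'name': fn.__name__,
            'input': {'args': args, 'kwargs': kwargs},
            'output': result,
            'latency_s': time.perf_counter() - start,
        })
        return result
    return wrapper

@trace
def answer(q: str) -> str:
    return f'echo: {q}'

answer('hello')
print(SPANS[0]['name'], SPANS[0]['output'])  # answer echo: hello
```

Nesting such decorated functions is what yields a distributed trace: each inner call becomes a child span of the caller's span.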
Framework facts
- Category: observability
- Language: Python / TypeScript
- License: MIT (core) / EE
- Repository: https://github.com/langfuse/langfuse
Install
pip install langfuse
# or
npm install langfuse

Quickstart
from langfuse import Langfuse, observe

langfuse = Langfuse(
    public_key='pk-lf-...',
    secret_key='sk-lf-...',
    host='https://cloud.langfuse.com'
)

@observe()
def answer(q: str) -> str:
    return f'echo: {q}'

answer('hello')
langfuse.flush()  # send buffered events before the script exits

Alternatives
- LangSmith — LangChain's commercial alternative
- Arize Phoenix — Apache-licensed, OTel-native
- Helicone — proxy-based observability
- Traceloop — OTel-based LLM tracing
Frequently asked questions
Is the self-hosted Langfuse fully featured?
The core (tracing, datasets, prompts, evals) is MIT-licensed and self-hostable. A small set of advanced features (SSO/SAML, advanced RBAC, analytics) is enterprise-only. Most teams run the self-hosted core in production successfully.
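As a sketch of the Docker-based self-hosting path mentioned above, a local deployment boils down to cloning the repository and starting the Compose stack; exact services, ports, and required environment variables can change between releases, so check the repository's compose file before relying on this.

```shell
# Clone the Langfuse repository and start the stack with Docker Compose.
# Service names, ports, and env vars may differ across releases; review
# docker-compose.yml in the repo before using this beyond local testing.
git clone https://github.com/langfuse/langfuse.git
cd langfuse
docker compose up -d
# The web UI is then typically served at http://localhost:3000
```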
Langfuse or LangSmith?
Choose Langfuse if you want open-source, self-hostable, framework-agnostic LLM observability with strong EU/US cloud options. Choose LangSmith if you're all-in on LangChain/LangGraph and want a tightly integrated commercial experience. Both are excellent.
Sources
- Langfuse — docs — accessed 2026-04-20
- Langfuse — GitHub — accessed 2026-04-20