Capability · Framework — observability
LangSmith
LangSmith is the production observability and evaluation platform for LangChain/LangGraph applications (and any OpenTelemetry-compatible LLM app). It provides distributed tracing of chains, agents, and tool calls; dataset collection from production traces; online and offline evaluations, including LLM-as-judge; prompt versioning and a playground; and dashboards for cost, latency, and quality. Both cloud and self-hosted deployments are available.
Framework facts
- Category: observability
- Language: Python / TypeScript
- License: commercial
- Repository: https://github.com/langchain-ai/langsmith-sdk
Install
pip install langsmith
# or
npm install langsmith
Quickstart
import os
from langsmith import traceable
os.environ['LANGSMITH_TRACING'] = 'true'
os.environ['LANGSMITH_API_KEY'] = '...'
@traceable
def answer(q: str) -> str:
    return f'echo: {q}'

answer('hi')  # appears in LangSmith UI
Alternatives
- Langfuse — open-source alternative
- Arize Phoenix — OTel-based open source
- Helicone — proxy-based observability
- Braintrust — evals + monitoring
Frequently asked questions
Do I need to use LangChain to use LangSmith?
No. LangSmith accepts OpenTelemetry traces and has direct SDK support. You can instrument any LLM app — OpenAI SDK, Anthropic SDK, vanilla Python — and get the same tracing, datasets, and evals UX.
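As a rough mental model of what a tracing decorator like `@traceable` captures, here is a minimal sketch in plain Python. The `trace` decorator and `RUNS` list are hypothetical illustrations, not LangSmith APIs; the real SDK also nests runs into a tree and ships them to the LangSmith backend.

```python
import functools
import time

# Hypothetical in-memory "run log"; LangSmith sends this data to its backend instead.
RUNS = []

def trace(fn):
    """Illustrative stand-in for a tracing decorator: records name,
    inputs, output, and latency for each call."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = fn(*args, **kwargs)
        RUNS.append({
            "name": fn.__name__,
            "inputs": {"args": args, "kwargs": kwargs},
            "output": result,
            "latency_s": time.time() - start,
        })
        return result
    return wrapper

@trace
def answer(q: str) -> str:
    return f"echo: {q}"

answer("hi")
print(RUNS[0]["name"], RUNS[0]["output"])  # answer echo: hi
```

The real decorator adds run IDs, parent/child links for nested calls, and error capture, but the shape of the recorded data is similar.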
What does LangSmith cost?
LangSmith has a free Developer plan with a monthly trace cap, paid Plus/Enterprise tiers for teams, and self-hosted enterprise licensing for on-prem or VPC deployments. Check the pricing page for current limits.
Sources
- LangSmith — docs — accessed 2026-04-20
- LangSmith — site — accessed 2026-04-20