LangChain
LangChain is the most widely adopted open-source framework for building applications on top of LLMs. It provides abstractions for chains, agents, tool use, memory, and retrieval, plus a commercial layer (LangSmith for observability, LangGraph Cloud for deployment) that is widely used in production AI engineering.
Framework facts
- Category: orchestration
- Language: Python / TypeScript
- License: MIT
- Repository: https://github.com/langchain-ai/langchain
Install
pip install langchain langgraph langchain-anthropic
# or
npm install langchain @langchain/core @langchain/anthropic

Quickstart
from langchain_anthropic import ChatAnthropic
from langgraph.prebuilt import create_react_agent
from langchain_core.tools import tool

@tool
def search(query: str) -> str:
    '''Search the web.'''
    return f'results for {query}'

model = ChatAnthropic(model='claude-opus-4-7')
agent = create_react_agent(model, tools=[search])
result = agent.invoke({'messages': [('user', 'what is MCP?')]})

Alternatives
- LlamaIndex — more opinionated for RAG
- Haystack — enterprise-focused, pipeline-centric
- Semantic Kernel — Microsoft's .NET-first equivalent
- DSPy — programmatic prompt optimisation instead of chains
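To make the quickstart less of a black box: create_react_agent wires up a loop in which the model either requests a tool call or returns a final answer. The sketch below is a toy illustration of that loop's shape in plain Python, not LangGraph's actual internals; fake_model is a hard-coded stand-in for a real LLM, and its canned answer is an assumption for demonstration only.

```python
def search(query: str) -> str:
    """Stand-in tool, mirroring the @tool function in the quickstart."""
    return f"results for {query}"

def fake_model(messages):
    # A real model decides this from the conversation; this stand-in
    # requests one search, then answers once a tool result is present.
    if not any(role == "tool" for role, _ in messages):
        return ("tool_call", "what is MCP?")
    return ("final", "MCP is the Model Context Protocol.")

def react_loop(user_input: str) -> str:
    messages = [("user", user_input)]
    while True:
        kind, content = fake_model(messages)
        if kind == "final":
            return content
        # Execute the requested tool and feed the result back to the model.
        messages.append(("tool", search(content)))

print(react_loop("what is MCP?"))
```

The real agent does the same thing with actual model calls, structured tool-call messages, and persisted state, which is why multi-step tool use is where LangGraph earns its keep.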
Frequently asked questions
Should I use LangChain or LangGraph?
LangGraph is the recommended agent API inside the LangChain ecosystem. Start with LangGraph for anything with multiple steps, tools, or state. Use plain LangChain for simpler chains and retrieval pipelines.
Is LangChain worth learning in 2026?
Yes, especially if you're building production agents. Despite competition from simpler frameworks, LangChain + LangGraph + LangSmith remain the most complete stack for serious agent engineering.
Sources
- LangChain — docs — accessed 2026-04-20
- LangGraph — docs — accessed 2026-04-20