Haystack
Haystack (now at 2.x) is deepset's modular framework for LLM applications. You compose pipelines from typed components — retrievers, generators, rankers, converters, routers — and run them as graphs. Haystack is RAG-first but also supports agents, multimodal pipelines, and tool use. It ships integrations for OpenAI, Anthropic, Cohere, Hugging Face, and the major vector stores, plus deepset Cloud for managed deployments.
Framework facts
- Category: rag
- Language: Python
- License: Apache 2.0
- Repository: https://github.com/deepset-ai/haystack
Install
pip install haystack-ai
Quickstart
from haystack import Pipeline
from haystack.components.generators import OpenAIGenerator
from haystack.components.builders import PromptBuilder

# Build a two-component pipeline: prompt template -> LLM
pipe = Pipeline()
pipe.add_component('prompt', PromptBuilder(template='Answer: {{question}}'))
pipe.add_component('llm', OpenAIGenerator(model='gpt-4o-mini'))
pipe.connect('prompt', 'llm')  # PromptBuilder's output feeds the generator's prompt input

# Inputs are addressed by component name (requires OPENAI_API_KEY)
result = pipe.run({'prompt': {'question': 'What is RAG?'}})
Alternatives
- LlamaIndex — more RAG-opinionated
- LangChain — broader ecosystem
- R2R — RAG server out of the box
- txtai — embeddings-first alternative
Frequently asked questions
Haystack or LangChain for RAG?
Haystack's typed pipeline model is stricter, which makes pipelines easier to test and deploy — a good fit for enterprise RAG. LangChain has a larger ecosystem and more experimental features. Many teams pick Haystack when they need predictable production pipelines.
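To see why typed connections help testing, here is a toy pure-Python sketch of the idea — this is not Haystack's actual API, just an illustration of validating component wiring by type before anything runs:

```python
from dataclasses import dataclass
from typing import Callable

# Toy sketch (NOT Haystack's real API): each component declares its
# input/output types, so bad wiring fails at connect time, not mid-run.

@dataclass
class Component:
    name: str
    in_type: type
    out_type: type
    fn: Callable

def connect(sender: Component, receiver: Component) -> None:
    # A type mismatch is caught here, before the pipeline executes
    if sender.out_type is not receiver.in_type:
        raise TypeError(f"{sender.name} -> {receiver.name}: "
                        f"{sender.out_type} != {receiver.in_type}")

prompt = Component('prompt', str, str, lambda q: f"Answer: {q}")
llm = Component('llm', str, str, lambda p: f"[reply to: {p}]")

connect(prompt, llm)  # ok: str -> str
out = llm.fn(prompt.fn('What is RAG?'))
print(out)  # [reply to: Answer: What is RAG?]
```

In real Haystack, `Pipeline.connect` performs an analogous check against each component's declared input and output sockets, which is what makes misconfigured graphs fail fast rather than at inference time.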
What's the difference between Haystack 1.x and 2.x?
Haystack 2.x (released 2024) is a ground-up rewrite with a cleaner component model, typed connections, serialisation, and better agent support. 1.x is in maintenance mode — new apps should start on 2.x.
Sources
- Haystack — docs — accessed 2026-04-20
- Haystack — GitHub — accessed 2026-04-20