mem0 — Agent Memory Layer
mem0 gives an AI agent persistent, per-user memory with roughly the interface you'd expect: `memory.add(...)`, `memory.search(...)`. Under the hood it extracts claims from conversations, merges them into a user-scoped store, and retrieves the most relevant facts to inject into future prompts. It's popular with startups building consumer-facing assistants, since it saves them building a memory layer from scratch.
Protocol facts
- Sponsor: Mem0
- Status: stable
- Spec: https://docs.mem0.ai/
- Interop with: OpenAI, Anthropic, LangChain, LlamaIndex, LangGraph
Frequently asked questions
What's the core abstraction in mem0?
A `Memory` instance scoped by `user_id` and optional `session_id`. You call `add(messages)` after a turn; mem0 extracts facts. Before the next turn you call `search(query)` to retrieve the top-K relevant memories and inject them into the system prompt.
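The turn loop can be sketched as follows. This is a toy stand-in, not mem0 itself: the class name `ToyMemory` and its naive keyword-overlap scoring are illustrative; the real library extracts facts with an LLM and ranks them by vector similarity.

```python
from collections import defaultdict

class ToyMemory:
    """Toy stand-in for the add/search turn loop. Real mem0 uses an
    LLM for fact extraction and a vector store for retrieval."""

    def __init__(self):
        self._facts = defaultdict(list)  # user_id -> list of fact strings

    def add(self, facts, user_id):
        # mem0 extracts facts from raw messages; here they arrive pre-extracted
        self._facts[user_id].extend(facts)

    def search(self, query, user_id, top_k=3):
        # naive keyword-overlap score in place of vector similarity
        words = set(query.lower().split())
        scored = [(len(words & set(f.lower().split())), f)
                  for f in self._facts[user_id]]
        scored.sort(key=lambda t: t[0], reverse=True)
        return [f for score, f in scored[:top_k] if score > 0]

mem = ToyMemory()
mem.add(["prefers dark mode", "lives in Berlin", "allergic to peanuts"],
        user_id="u1")
hits = mem.search("is the user allergic to anything?", user_id="u1")
# hits -> ["allergic to peanuts"], ready to inject into the next system prompt
```

The point is the shape of the loop: `add` after each turn, `search` before the next, injected results in the prompt.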
How does mem0 handle conflicting facts?
It uses an LLM to reconcile: when a new fact conflicts with an existing one, mem0 decides whether to update, add, or preserve, and logs the change history so you can audit.
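The shape of that reconcile step can be sketched like this. mem0 delegates the actual judgment to an LLM; the sketch below replaces it with a trivial rule (facts about the same subject conflict, newer wins) purely to show the update/add decision and the audit log. All names here are hypothetical, not mem0's API.

```python
def reconcile(existing, new_fact, subject_of):
    """Decide how a new fact lands relative to existing memories.
    Real mem0 asks an LLM to judge conflicts; this toy rule treats
    two facts about the same subject as conflicting."""
    history = []
    for i, old in enumerate(existing):
        if subject_of(old) == subject_of(new_fact):
            history.append(("UPDATE", old, new_fact))  # audit trail entry
            existing[i] = new_fact                     # newer fact replaces older
            return existing, history
    existing.append(new_fact)                          # no conflict: just add
    history.append(("ADD", None, new_fact))
    return existing, history

# toy subject extractor: the "key:" prefix of each fact
subject = lambda f: f.split()[0]

mems = ["coffee: takes it black", "city: Berlin"]
mems, log = reconcile(mems, "city: moved to Munich", subject)
# mems now holds the Munich fact; log records the UPDATE for auditing
```

Keeping the change history alongside the store is what makes the reconciliation auditable after the fact.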
Self-hosted or managed?
Both. The OSS library runs against your own vector store and LLM provider. Mem0 also offers a managed cloud service with additional features (analytics, role-based access).
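Self-hosting means wiring in your own vector store and LLM provider. The dict below follows the provider/config pattern described in the mem0 docs, but the exact keys, provider names, and model values should be treated as illustrative; check the spec link above for the current schema.

```python
# Illustrative self-hosted configuration. Provider names and nested keys
# follow the pattern in the mem0 docs, but verify fields against the spec.
config = {
    "vector_store": {
        "provider": "qdrant",                   # your own vector DB
        "config": {"host": "localhost", "port": 6333},
    },
    "llm": {
        "provider": "openai",                   # LLM used for fact extraction
        "config": {"model": "gpt-4o-mini"},
    },
}
# the OSS library is then initialized from a config like this
# (per the docs, roughly: Memory.from_config(config))
```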
Sources
- mem0 GitHub — accessed 2026-04-20
- mem0 docs — accessed 2026-04-20