Nine open-source AI agent memory frameworks compared, including Hindsight (Vectorize.io), agent-memory, Letta, Mem0, Zep, MemOS, and more.
| Framework | MCP | Encryption | TTL | Local | License | Key Feature |
|---|---|---|---|---|---|---|
| ★ agent-memory | Native | AES-256 | Yes | 100% | MIT | Universal MCP + encryption |
| Hindsight | Native | No | Adaptive | Docker local | MIT | Knowledge graph + reflect |
| Letta | Via MCP | Yes | Yes | Partial | MIT | Stateful LLM apps |
| Mem0 | Yes | No | Limited | Cloud | Open Core | Teams, cloud-first |
| Zep | Yes | Yes | Yes | Cloud | Proprietary | Production AI apps |
| MemOS | OpenClaw | No | No | Cloud+local | AGPL | OpenClaw ecosystem |
| memU | OpenClaw | No | No | Cloud+local | AGPL | 24/7 proactive agents |
| LangGraph Memory | No | No | Custom | Custom | MIT | LangChain integration |
| Redis InMemoryStore | No | No | Yes (Redis) | Local | BSD | Fast, vector search |
Hindsight is an open-source MCP memory server from Vectorize.io. Key quote:
"Hindsight isn't a vector database. It extracts structured facts, resolves entities, builds a knowledge graph, and uses cross-encoder reranking to surface what actually matters. Three core operations: retain (store), recall (search), reflect (reason). Plus mental models — living documents that auto-update as memories grow." — Hindsight Blog, March 4, 2026
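The retain/recall/reflect operation model from the quote can be illustrated with a toy in-memory store. The class and method bodies below are a conceptual sketch, not Hindsight's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class ToyMemory:
    """Minimal stand-in for the retain/recall/reflect model (not Hindsight's API)."""
    facts: list = field(default_factory=list)

    def retain(self, fact: str) -> None:
        # Store: Hindsight would also extract entities and update its knowledge graph.
        self.facts.append(fact)

    def recall(self, query: str) -> list:
        # Search: naive substring match here; Hindsight uses cross-encoder reranking.
        q = query.lower()
        return [f for f in self.facts if q in f.lower()]

    def reflect(self) -> str:
        # Reason: summarize what is stored; Hindsight auto-updates "mental models".
        return f"{len(self.facts)} facts retained"

m = ToyMemory()
m.retain("User prefers SQLite backend")
m.retain("User works in UTC+2")
print(m.recall("sqlite"))  # → ['User prefers SQLite backend']
print(m.reflect())         # → 2 facts retained
```

The point of the three-operation split is that storage, search, and reasoning are separate concerns the server handles for you, rather than raw vector upserts and queries.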
Key differentiators of Hindsight:

- Knowledge graph plus mental models
- Reflect operation for automated reasoning
- One-command Docker setup
- MIT license

Key differentiators of agent-memory:

- AES-256 encryption
- TTL with configurable expiration
- 100% local (no Docker)
- Universal MCP support
- JSON, SQLite, and Redis backends
- MIT license
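TTL-based expiry of the kind agent-memory advertises can be sketched as follows; this is an illustrative toy, not the library's implementation:

```python
import time

class TTLStore:
    """Toy key-value store with per-entry time-to-live (illustrative sketch)."""
    def __init__(self):
        self._data = {}  # key -> (value, expires_at or None)

    def set(self, key, value, ttl=None):
        # ttl is in seconds; None means the entry never expires.
        expires_at = time.monotonic() + ttl if ttl is not None else None
        self._data[key] = (value, expires_at)

    def get(self, key):
        value, expires_at = self._data.get(key, (None, None))
        if expires_at is not None and time.monotonic() >= expires_at:
            del self._data[key]  # lazy eviction: expired entries are dropped on read
            return None
        return value

s = TTLStore()
s.set("session", "abc", ttl=0.05)
print(s.get("session"))  # → abc
time.sleep(0.06)
print(s.get("session"))  # → None
```

Lazy eviction on read keeps writes cheap; a background sweep is the usual alternative when memory pressure matters more than write latency.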
```bash
# Install agent-memory
pip install agent-memory

# Run the MCP server (no Docker needed)
python -m agent_memory.mcp_server --storage json --path ./memory.json

# Connect any MCP client: OpenHands, Claude Code, Cursor, Cline, Goose
```
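Most MCP clients register a stdio server through a JSON configuration along these lines; the exact file name and location vary by client, and the server name here is illustrative:

```json
{
  "mcpServers": {
    "agent-memory": {
      "command": "python",
      "args": ["-m", "agent_memory.mcp_server", "--storage", "json", "--path", "./memory.json"]
    }
  }
}
```

Once registered, the client launches the server on startup and exposes its memory tools to the model automatically.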