Hindsight
Hindsight is an open-source agent memory system that enables AI agents to learn over time using biomimetic data structures for state-of-the-art long-term memory performance.
At a Glance
About Hindsight
Hindsight™ is an open-source agent memory system built to create smarter AI agents that learn over time, not just remember conversation history. It uses biomimetic data structures to organize memories similarly to how human memory works, achieving state-of-the-art performance on the LongMemEval benchmark. Hindsight can be integrated with just 2 lines of code via an LLM wrapper, or through a flexible API with Python, Node.js, REST, and CLI SDKs. It supports self-hosted Docker deployments as well as a managed cloud offering.
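The "swap your LLM client for a memory-aware wrapper" pattern described above can be sketched in a few lines of plain Python. This is an illustrative mock, not the actual Hindsight SDK: `FakeLLMClient`, `MemoryWrapper`, and the naive keyword recall are all hypothetical stand-ins for the real client, wrapper class, and four-strategy recall pipeline.

```python
class FakeLLMClient:
    """Stand-in for any chat-completion client (OpenAI, Anthropic, ...)."""
    def complete(self, messages):
        return "ok"

class MemoryWrapper:
    """Drop-in replacement with the same interface as the wrapped client:
    it recalls relevant memories before each call and retains the exchange
    afterward, so the agent accumulates context across conversations."""
    def __init__(self, client):
        self.client = client
        self.memory = []  # stands in for a Hindsight memory bank

    def recall(self, query):
        # Naive keyword match; the real system runs four parallel strategies.
        return [m for m in self.memory if any(w in m for w in query.split())]

    def complete(self, messages):
        query = messages[-1]["content"]
        context = self.recall(query)
        # Prepend recalled memories as extra context for the model.
        augmented = [{"role": "system",
                      "content": "Memories: " + "; ".join(context)}] + messages
        reply = self.client.complete(augmented)
        self.memory.append(query)  # retain the user turn
        self.memory.append(reply)  # retain the assistant turn
        return reply

# The "2 lines": wrap the client once, then use it exactly as before.
llm = MemoryWrapper(FakeLLMClient())
llm.complete([{"role": "user", "content": "remember my name is Ada"}])
```

The key design point is that the wrapper preserves the wrapped client's call signature, which is why existing agent code needs no other changes.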
- Retain Operation: Store information into Hindsight memory banks, where an LLM extracts key facts, temporal data, entities, and relationships automatically.
- Recall Operation: Retrieve memories using 4 parallel strategies — semantic vector search, BM25 keyword matching, graph-based entity/temporal links, and time range filtering — merged via reciprocal rank fusion and cross-encoder reranking.
- Reflect Operation: Perform deep analysis of existing memories to form new connections, build mental models, and generate insights from accumulated experiences.
- Biomimetic Memory Architecture: Organizes memories into World Facts, Experiences, and Mental Models pathways, with entity, relationship, and time-series representations.
- LLM Wrapper Integration: Add memory to any existing agent with 2 lines of code by swapping your LLM client for the Hindsight wrapper.
- Multi-Provider LLM Support: Works with OpenAI, Anthropic, Gemini, Groq, Ollama, LMStudio, and Minimax as the underlying LLM provider.
- Per-User Memory Isolation: Use custom metadata to isolate and filter memories per user, enabling personalized AI chatbots and conversational agents.
- Python Embedded Mode: Run Hindsight in-process without a separate server using the hindsight-all package.
- Docker Deployment: Self-host with a single Docker command or use Docker Compose with an external PostgreSQL database.
- Hindsight Cloud: Managed cloud version available at ui.hindsight.vectorize.io for teams that prefer not to self-host.
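Of the recall machinery listed above, reciprocal rank fusion is a standard, well-documented algorithm for merging ranked lists, so it can be shown concretely. The sketch below is generic RRF with the conventional k=60 smoothing constant; Hindsight's actual constant, weighting, and memory IDs are assumptions.

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Merge several ranked lists of memory ids into one fused ranking.

    Each list contributes 1 / (k + rank) per item, so memories that appear
    near the top of multiple strategies' results rise to the top overall.
    """
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical ranked results from three of the four recall strategies:
semantic = ["m3", "m1", "m7"]   # vector similarity order
keyword  = ["m1", "m3", "m9"]   # BM25 order
graph    = ["m1", "m5"]         # entity/temporal link order

fused = reciprocal_rank_fusion([semantic, keyword, graph])
# "m1" ranks first: it appears high in all three lists.
```

In the full pipeline described above, a cross-encoder reranker would then rescore this fused list before the memories are returned.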
Pricing
Open Source (Self-Hosted)
Fully open source under the MIT License and free to self-host via Docker or Python embedded mode.
- Retain, Recall, Reflect operations
- Docker deployment
- Python embedded mode
- Python and Node.js SDKs
- REST API and CLI
Capabilities
Key Features
- Agent memory with learning over time
- Retain, Recall, and Reflect operations
- Biomimetic memory architecture (World, Experiences, Mental Models)
- 4-strategy parallel recall (semantic, keyword, graph, temporal)
- LLM wrapper for 2-line integration
- Multi-provider LLM support (OpenAI, Anthropic, Gemini, Groq, Ollama, LMStudio, Minimax)
- Per-user memory isolation via metadata
- Python and Node.js SDKs
- REST API and CLI
- Docker self-hosting
- Python embedded mode (no server required)
- Hindsight Cloud managed offering
- State-of-the-art LongMemEval benchmark performance
- Memory banks for organizing memories
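Per-user memory isolation via metadata, listed among the features above, amounts to tagging each retained memory and filtering at recall time. The sketch below uses a toy in-memory store; the class name, parameter names (`metadata`, `metadata_filter`), and substring matching are all hypothetical, not the real Hindsight API.

```python
class MemoryBank:
    """Toy memory bank that attaches arbitrary metadata to each entry."""
    def __init__(self):
        self.entries = []

    def retain(self, text, metadata=None):
        self.entries.append({"text": text, "metadata": metadata or {}})

    def recall(self, query, metadata_filter=None):
        results = []
        for entry in self.entries:
            # Skip entries whose metadata doesn't match every filter key.
            if metadata_filter and any(
                entry["metadata"].get(k) != v for k, v in metadata_filter.items()
            ):
                continue
            if query.lower() in entry["text"].lower():
                results.append(entry["text"])
        return results

bank = MemoryBank()
bank.retain("prefers dark mode", metadata={"user_id": "alice"})
bank.retain("prefers light mode", metadata={"user_id": "bob"})

# Recall scoped to one user: only alice's memory comes back.
alice_mems = bank.recall("prefers", metadata_filter={"user_id": "alice"})
```

Because the filter is plain key-value metadata, the same mechanism extends to tenants, sessions, or projects, which is what makes per-user chatbot personalization possible without separate memory banks per user.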
