ReMe
A memory management toolkit for AI agents providing file-based and vector-based memory systems to address limited context windows and stateless sessions.
At a Glance
About ReMe
ReMe (Remember Me, Refine Me) is an open-source memory management framework for AI agents, licensed under Apache 2.0. It tackles two core problems of agent memory: limited context windows (where early information is truncated in long conversations) and stateless sessions (where new sessions cannot inherit history). ReMe provides both a file-based memory system (ReMeLight) and a vector-based memory system, achieving state-of-the-art results on the LoCoMo and HaluMem benchmarks.
- File-based memory (ReMeLight): Treats memory as readable, editable Markdown files stored in a structured directory — no opaque databases, easy to inspect and migrate.
- Context management: Automatically checks context size, compacts conversation history into structured summaries, and handles long tool outputs to prevent context overflow.
- Vector-based memory system: Manages personal, procedural, and tool memories using vector stores (local, Chroma, Qdrant, Elasticsearch, or OBVec) with semantic retrieval.
- Hybrid memory search: Combines vector search (weight 0.7) and BM25 keyword matching (weight 0.3) for balanced semantic and exact-match retrieval.
- Pre-reasoning hook: A unified entry point that wires all memory components together — compact tool results, check context, generate summaries, and persist memory asynchronously before each agent reasoning step.
- Persistent long-term memory: Uses a ReAct + file tools pattern so the AI decides what to write and where, persisting important information to daily Markdown journals.
- In-session memory (ReMeInMemoryMemory): Token-aware memory management with raw conversation persistence to JSONL files, supporting session state serialization and deserialization.
- Benchmark performance: Achieves 86.23% overall on LoCoMo and 88.78% QA accuracy on HaluMem, outperforming Mem0, MemOS, Zep, and other baselines.
- Easy installation: Install via pip install reme-ai or from source; configure LLM and embedding backends through environment variables.
- Framework integrations: Integrates with AgentScope, QwenPaw (CoPaw), and supports OpenAI-compatible LLM APIs and multiple vector store backends.
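The hybrid memory search above combines vector similarity (weight 0.7) with BM25 keyword matching (weight 0.3). The weights come from the description; everything else below — the cosine and BM25 implementations, the min-max normalization, and the toy data — is an illustrative sketch of weighted score fusion, not ReMe's actual API:

```python
import math

def cosine(a, b):
    # Cosine similarity between two dense vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def bm25_scores(query_terms, docs, k1=1.5, b=0.75):
    # Minimal Okapi BM25 over pre-tokenized documents.
    n = len(docs)
    avgdl = sum(len(d) for d in docs) / n
    scores = []
    for doc in docs:
        score = 0.0
        for term in query_terms:
            tf = doc.count(term)
            df = sum(1 for d in docs if term in d)
            idf = math.log((n - df + 0.5) / (df + 0.5) + 1)
            denom = tf + k1 * (1 - b + b * len(doc) / avgdl)
            score += idf * tf * (k1 + 1) / denom if denom else 0.0
        scores.append(score)
    return scores

def normalize(xs):
    # Min-max normalize so the two score scales are comparable.
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) if hi > lo else 0.0 for x in xs]

def hybrid_search(query_vec, query_terms, doc_vecs, doc_tokens,
                  w_vec=0.7, w_bm25=0.3):
    # Fuse normalized vector and keyword scores with fixed weights,
    # returning document indices ranked best-first.
    vec = normalize([cosine(query_vec, v) for v in doc_vecs])
    kw = normalize(bm25_scores(query_terms, doc_tokens))
    fused = [w_vec * v + w_bm25 * k for v, k in zip(vec, kw)]
    return sorted(range(len(fused)), key=lambda i: -fused[i])
```

With weighted fusion like this, a memory that is only semantically related can still be outranked by one that also matches the query's exact keywords, which is the balance the 0.7/0.3 split aims for.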
Pricing
Open Source
Fully open-source under the Apache License 2.0: free to use, modify, and distribute.
- File-based memory system (ReMeLight)
- Vector-based memory system
- Context management and compaction
- Hybrid vector + BM25 memory search
- Pre-reasoning hook
Capabilities
Key Features
- File-based memory system (ReMeLight) using Markdown files
- Vector-based memory system with personal, procedural, and tool memory types
- Context window management with automatic compaction
- Hybrid vector + BM25 memory search
- Pre-reasoning hook for automated context management
- Long-term memory persistence to daily Markdown journals
- Tool result compaction to prevent context bloat
- In-session token-aware memory (ReMeInMemoryMemory)
- Async background memory summarization
- Support for local, Chroma, Qdrant, Elasticsearch, and OBVec vector stores
- State-of-the-art results on LoCoMo and HaluMem benchmarks
- Apache 2.0 open-source license
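The in-session pattern listed above (token-aware trimming, raw conversation persistence to JSONL, and session state serialization) can be sketched roughly as follows. The class name, the whitespace token heuristic, and the file layout are assumptions for illustration only, not the actual ReMeInMemoryMemory API:

```python
import json
from pathlib import Path

class InMemorySessionMemory:
    """Toy in-session memory: appends every raw message to a JSONL log
    and keeps only the most recent messages within a token budget."""

    def __init__(self, log_path, max_tokens=200):
        self.log_path = Path(log_path)
        self.max_tokens = max_tokens
        self.messages = []

    @staticmethod
    def _count_tokens(text):
        # Crude stand-in for a real tokenizer: whitespace word count.
        return len(text.split())

    def add(self, role, content):
        msg = {"role": role, "content": content}
        self.messages.append(msg)
        # Persist the raw message as one JSON line (append-only log).
        with self.log_path.open("a", encoding="utf-8") as f:
            f.write(json.dumps(msg) + "\n")
        self._trim()

    def _trim(self):
        # Drop the oldest messages until the working set fits the budget.
        while (sum(self._count_tokens(m["content"]) for m in self.messages)
               > self.max_tokens and len(self.messages) > 1):
            self.messages.pop(0)

    def dump_state(self):
        # Serialize the in-memory session state for later restoration.
        return json.dumps({"messages": self.messages})

    @classmethod
    def load_state(cls, log_path, state):
        # Deserialize a previously dumped session state.
        mem = cls(log_path)
        mem.messages = json.loads(state)["messages"]
        return mem
```

The key design point the sketch illustrates: the working set handed to the model is bounded by tokens, while the JSONL log remains a complete, append-only record of the raw conversation.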
