LangMem
Open-source SDK from LangChain for long-term memory in LLM agents, with hot-path tools, a background memory manager, and native LangGraph storage integration.
About LangMem
LangMem is a Python SDK for adding long-term memory to LLM agents. It provides two complementary paths:
- Hot path tools (create_manage_memory_tool, create_search_memory_tool) that agents call during conversations to store and retrieve facts, preferences, and other context.
- Background manager that periodically extracts, consolidates, and updates memories outside the request path.
The core API is storage-agnostic and works with LangGraph’s BaseStore implementations (e.g., in-memory for dev, Postgres for production). It supports semantic and episodic memory patterns, prompt optimization/“procedural memory,” and dynamic namespaces for user- or team-scoped memories. Typical setups pair LangMem with LangGraph agents, Postgres/pgvector for persistence, and common model providers (Anthropic, OpenAI).
Install: pip install -U langmem
Requires: Python 3.10+
Use it to build agents that remember user preferences across sessions, keep evolving profiles, and refine prompts over time—without hand-rolling memory extraction, deduplication, and retrieval plumbing.
Pricing
Free Plan Available
Get started with LangMem at no cost. The free plan includes:
- Python package via PyPI
- All core SDK features
- Self-hosted storage (in-memory, Postgres/pgvector)
Managed memory service (invite/beta)
An invite-only beta plan offering a hosted long-term memory store with LangGraph-native integration:
- Hosted long-term memory store
- LangGraph-native integration
Capabilities
Key Features
- Hot-path memory tools for store/search within agent runs
- Background memory manager for batch extraction and consolidation
- Storage-agnostic core API; works with LangGraph BaseStore
- Native integrations for InMemoryStore and AsyncPostgresStore
- Dynamic namespaces for user/team-scoped memories
- Supports semantic, episodic, and procedural (prompt) memory patterns
- Embeddings-based retrieval (e.g., OpenAI text-embedding-3-small)
- Typed/structured memories via Pydantic models
- CrewAI and custom-agent usage guides
- MIT-licensed; Python package via PyPI