Honcho
An open source memory library with a managed service for building stateful AI agents that can maintain and reason about any entity over time.
At a Glance
Fully open source under AGPL-3.0. Self-host Honcho for free using Docker or Fly.io.
Listed May 2026
About Honcho
Honcho is an open source memory library and managed service for building stateful AI agents. It enables agents to build and maintain state about any entity—users, agents, groups, ideas, and more—using a continual learning system that understands entities as they change over time. Honcho works with any model, framework, or architecture, and exposes a rich reasoning pipeline that asynchronously derives insights from interactions to personalize agent behavior.
- Peer Paradigm: Both users and AI agents are represented as "peers," enabling multi-participant sessions with mixed human and AI agents and flexible identity management.
- Memory & Storage Primitives: Workspaces, Peers, Sessions, Messages, Collections, and Documents provide a structured hierarchy for storing and retrieving all application and conversation state.
- Reasoning Pipeline: Asynchronously derives facts and representations about peers from messages, building comprehensive psychological models that inform agent responses.
- Chat (Dialectic) API: A natural language endpoint (`/peers/{peer_id}/chat`) lets developers query Honcho as an oracle about any peer—hydrating prompts, getting personalized insights, or requesting second opinions.
- Context Endpoint: Returns a token-limited combination of messages, conclusions, and summaries to keep long-running LLM conversations going indefinitely.
- Hybrid Search: Query messages at the Workspace, Session, or Peer level using hybrid search with advanced filters.
- Representations: Low-latency static documents summarizing peer insights for quick prompt augmentation without waiting for an LLM response.
- Multi-Provider LLM Support: Configurable backends for Google Gemini, Anthropic, and OpenAI for different reasoning tasks.
- Self-Hostable: Full local development support via Docker Compose, Fly.io deployment, and a FastAPI server with PostgreSQL/pgvector.
- Python & TypeScript SDKs: Install via `pip install honcho-ai` or `npm install @honcho-ai/sdk` and get started with a few lines of code.
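The storage primitives above form a simple hierarchy: a Workspace contains Peers and Sessions, and Sessions hold Messages attributed to Peers. As a conceptual sketch in plain Python (illustrative class and field names, not Honcho's actual SDK or schema), the peer paradigm might look like this, with a human and an AI agent participating in the same session:

```python
from dataclasses import dataclass, field

# Conceptual model of Honcho's hierarchy: Workspace -> Peers + Sessions,
# Session -> Messages. All names here are illustrative assumptions,
# not the real Honcho schema or SDK.

@dataclass
class Peer:
    id: str  # both users and AI agents are represented as peers

@dataclass
class Message:
    peer_id: str
    content: str

@dataclass
class Session:
    id: str
    peer_ids: set[str] = field(default_factory=set)
    messages: list[Message] = field(default_factory=list)

    def add_message(self, peer: Peer, content: str) -> None:
        # Multi-participant sessions: any peer, human or agent, can post.
        self.peer_ids.add(peer.id)
        self.messages.append(Message(peer.id, content))

@dataclass
class Workspace:
    id: str
    peers: dict[str, Peer] = field(default_factory=dict)
    sessions: dict[str, Session] = field(default_factory=dict)

# A session mixing a human peer and an agent peer:
ws = Workspace("demo")
alice = ws.peers.setdefault("alice", Peer("alice"))
bot = ws.peers.setdefault("assistant", Peer("assistant"))
chat = ws.sessions.setdefault("s1", Session("s1"))
chat.add_message(alice, "I prefer concise answers.")
chat.add_message(bot, "Noted. I'll keep replies short.")
```

In Honcho itself, messages written this way feed the asynchronous reasoning pipeline, which derives representations of each peer in the background.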
Pricing
Open Source (Self-Hosted)
Self-host Honcho for free under the AGPL-3.0 license using Docker or Fly.io.
- Full Honcho server source code
- Python and TypeScript SDKs
- Docker Compose support
- Fly.io deployment support
- All storage and reasoning features
Managed Cloud
Managed hosted service at app.honcho.dev. Sign up and get $100 free credits to start.
- Dedicated hosted Honcho instance
- $100 free credits on signup
- API key provisioning
- Managed infrastructure
- Access to all Honcho features
Capabilities
Key Features
- Open source memory library (AGPL-3.0)
- Managed cloud service at app.honcho.dev
- Peer-centric entity model for users and agents
- Asynchronous reasoning and representation pipeline
- Chat (Dialectic) API for natural language peer queries
- Context endpoint for token-limited conversation history
- Hybrid search across Workspace, Session, and Peer levels
- Collections and Documents for RAG applications
- Session summarization
- Multi-provider LLM support (Gemini, Anthropic, OpenAI)
- Self-hostable via Docker Compose or Fly.io
- Python and TypeScript SDKs
- pgvector, TurboPuffer, and LanceDB vector store support
- JWT-based authentication
- Prometheus metrics and CloudEvents telemetry
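The context endpoint's core idea—fitting a summary plus recent messages into a fixed token budget—can be sketched in a few lines. This is an illustrative algorithm only, with whitespace word counting standing in for real tokenization; Honcho's actual implementation and response shape will differ:

```python
# Sketch of a token-limited context builder: always include the session
# summary, then pack in the most recent messages that fit the budget.
# Token counting here is a whitespace-split approximation (assumption).

def get_context(summary: str, messages: list[str], max_tokens: int) -> list[str]:
    def tokens(text: str) -> int:
        return len(text.split())

    budget = max_tokens - tokens(summary)
    recent: list[str] = []
    # Walk backwards so the newest messages are kept first.
    for msg in reversed(messages):
        cost = tokens(msg)
        if cost > budget:
            break
        budget -= cost
        recent.append(msg)
    # Restore chronological order, summary first.
    return [summary] + list(reversed(recent))

history = [
    "hello there",
    "tell me about memory systems",
    "they derive facts asynchronously",
]
ctx = get_context("Summary: user studies agent memory.", history, max_tokens=12)
```

Because older turns collapse into the summary while recent turns stay verbatim, a conversation assembled this way can, in principle, run indefinitely within a fixed context window.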
