LangGraph
Low-level orchestration framework from LangChain for building stateful, long-running agents with durable execution, human-in-the-loop control, and first-class streaming.
At a Glance
LangGraph is MIT-licensed and free to use forever in any environment, including production. No usage limits, no seat counts, no telemetry required. The optional LangGraph Platform / LangSmith Deployment is a separate paid hosted service for teams who want managed deployment infrastructure.
Listed May 2026
About LangGraph
LangGraph is LangChain's low-level orchestration framework and runtime for building, managing, and deploying long-running, stateful agents. Rather than wrapping LLM calls in rigid abstractions, it gives developers explicit graph primitives — nodes, edges, and shared state — so agent control flow is something you write rather than something the framework hides. It powers production agent systems at companies including Klarna, Uber, LinkedIn, Replit, GitLab, and J.P. Morgan, and reached its 1.0 stable release in October 2025.
The framework ships in both Python (langgraph on PyPI) and TypeScript (@langchain/langgraph on npm), with the JavaScript implementation tracking slightly behind Python on newer features. LangChain's high-level create_agent API is built directly on top of LangGraph's runtime, so teams can start with prebuilt agent patterns and drop down to LangGraph primitives whenever they need fine-grained control.
- Durable Execution — Persists agent state to a checkpointer so workflows survive crashes, timeouts, and process restarts, resuming from exactly where they left off rather than starting over.
- Human-in-the-Loop Interrupts — Pause execution at any node to surface tool calls, edits, or approvals to a human, then resume the graph from that exact state once the human responds.
- Low-Level Primitives — Build agents from `StateGraph`, nodes, and edges with no forced agent abstractions; you write the control flow as code rather than configuring a black box.
- First-Class Streaming — Native token-by-token streaming plus structured streaming of node outputs, intermediate state, and custom events while the graph is running.
- Comprehensive Memory — Short-term working memory via thread-scoped state and long-term cross-session memory via the `Store` API, with checkpointers like Postgres, SQLite, or in-memory.
- Time Travel — Rewind to any prior checkpoint, edit state, and re-run the graph from that point — useful for debugging non-deterministic agent behavior.
- Subgraphs and Multi-Agent Patterns — Compose graphs inside nodes to build hierarchical, supervisor, and swarm topologies, with companion packages `@langchain/langgraph-supervisor` and `@langchain/langgraph-swarm`.
- Functional API — An alternative to the graph DSL using `entrypoint` and `task` decorators, letting you write agents as regular async functions while still getting durability and streaming.
- LangSmith Integration — Trace every node execution, state transition, and LLM call automatically by setting `LANGSMITH_TRACING=true`, with no code changes to the graph.
- LangGraph Platform — Optional managed deployment service (LangSmith Deployment) with auto-scaling task queues, persistent storage, cron jobs, and a visual Studio for prototyping; the framework itself runs anywhere without it.
LangGraph is MIT-licensed and free to use forever. The paid LangSmith Deployment / LangGraph Platform is a separate, optional product for hosted agent infrastructure. The framework draws conceptual inspiration from Google's Pregel paper and Apache Beam, with a public interface modeled on NetworkX. It pairs naturally with LangChain for model and tool integrations and with LangSmith for evaluation and observability, but works standalone with any LLM provider or framework.
Pricing
Open Source
- Full Python and TypeScript framework
- Durable execution and checkpointing
- Human-in-the-loop interrupts
- Streaming and time travel
- Subgraphs and multi-agent patterns
Capabilities
Key Features
- Graph-based agent orchestration with nodes, edges, and shared state
- Durable execution with automatic resume from checkpoints
- Human-in-the-loop interrupts for inspection, edits, and approvals
- Native token-by-token and structured event streaming
- Short-term thread memory and long-term cross-session memory store
- Time travel debugging via checkpoint rewind and replay
- Subgraphs for hierarchical and multi-agent compositions
- Functional API as alternative to graph DSL
- Python and TypeScript implementations with shared concepts
- Built-in LangSmith tracing via single environment variable
- Prebuilt ReAct agent and tool node helpers
- Pluggable checkpointers for Postgres, SQLite, and in-memory state
- Powers LangChain's high-level create_agent API
- Optional LangGraph Platform for managed deployment
