DeepSeek TUI
A terminal-native coding agent for DeepSeek models, built in Rust with 1M-token context, thinking-mode streaming, MCP client, sandbox, and durable task queue.
At a Glance
Fully free and open-source under the MIT License. Requires your own DeepSeek API key.
Listed May 2026
About DeepSeek TUI
DeepSeek TUI is a terminal-native coding agent built for DeepSeek V4 models, shipping as self-contained Rust binaries with no Node.js or Python runtime required. It gives DeepSeek's frontier models direct access to your workspace — reading and editing files, running shell commands, searching the web, managing git, and orchestrating sub-agents — all through a fast, keyboard-driven TUI. It supports a 1M-token context window, native thinking-mode (chain-of-thought) streaming, and prefix-cache-aware cost efficiency.
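The automatic compaction mentioned above (folding old turns into a summary once the context window fills) can be illustrated with a minimal sketch. The 80% threshold, the turn representation, and the summary format here are assumptions for illustration, not DeepSeek TUI's actual implementation:

```python
# Illustrative context-compaction sketch: when the running token count
# exceeds a fraction of the window, older turns are folded into a single
# summary entry. Threshold and summary shape are assumed, not the tool's.

CONTEXT_WINDOW = 1_000_000
COMPACT_AT = 0.8  # assumed: start compacting at 80% of the window

def compact(turns):
    """turns: list of (role, text, token_count) tuples, oldest first.

    Returns the list unchanged while under the threshold; otherwise
    keeps the most recent turns that fit in half the budget and
    replaces the rest with one summary entry.
    """
    total = sum(tokens for _, _, tokens in turns)
    if total <= CONTEXT_WINDOW * COMPACT_AT:
        return turns

    keep = []
    budget = CONTEXT_WINDOW * COMPACT_AT // 2
    for turn in reversed(turns):          # walk newest -> oldest
        if budget - turn[2] < 0:
            break
        budget -= turn[2]
        keep.append(turn)
    keep.reverse()

    dropped = len(turns) - len(keep)
    summary = ("system", f"[compacted {dropped} earlier turns]", 50)
    return [summary] + keep
```

The real agent would also have to re-summarize content rather than just drop it, and account for prefix-cache boundaries so compaction does not invalidate cached prefixes unnecessarily.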
Key features include:
- Native RLM (rlm_query) — fans out 1–16 parallel deepseek-v4-flash children for batched analysis and parallel reasoning
- Thinking-mode streaming — watch the model's chain-of-thought unfold in real time as it works through tasks
- Full tool suite — file ops, shell execution, git, web search/browse, apply-patch, sub-agents, and MCP servers
- 1M-token context — automatic intelligent compaction when context fills up; prefix-cache aware for cost efficiency
- Three modes — Plan (read-only explore), Agent (interactive with approval), and YOLO (auto-approved)
- Reasoning-effort tiers — cycle through off → high → max with Shift+Tab
- Session save/resume — checkpoint and resume long-running sessions by UUID or fork at a chosen turn
- Workspace rollback — side-git pre/post-turn snapshots with /restore and revert_turn without touching your repo's .git
- Durable task queue — background tasks survive restarts for scheduled automation and long-running reviews
- HTTP/SSE runtime API — deepseek serve --http for headless agent workflows
- MCP protocol — connect to Model Context Protocol servers for extended tooling
- LSP diagnostics — inline error/warning surfacing after every edit via rust-analyzer, pyright, typescript-language-server, gopls, and clangd
- Skills system — composable, installable instruction packs from GitHub with no backend service required
- Live cost tracking — per-turn and session-level token usage, cost estimates, and cache hit/miss breakdown
- Localized UI — supports en, ja, zh-Hans, pt-BR with auto-detection
- Multiple install paths — npm, Cargo, Homebrew, Scoop, or direct binary download for Linux x64/ARM64, macOS x64/ARM64, and Windows x64
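The HTTP/SSE runtime API (deepseek serve --http) streams agent output over Server-Sent Events. A minimal client-side parser for the SSE wire format can be sketched as follows; the event names and payload shapes shown in the test are assumptions, not the project's documented schema:

```python
def parse_sse(stream_text):
    """Parse raw SSE text into (event_type, data) pairs.

    Per the SSE wire format, events are separated by blank lines and
    carry an optional 'event:' field plus one or more 'data:' fields.
    """
    events = []
    event_type, data_lines = "message", []
    for line in stream_text.splitlines():
        if line.startswith("event:"):
            event_type = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "":
            # Blank line dispatches the accumulated event.
            if data_lines:
                events.append((event_type, "\n".join(data_lines)))
            event_type, data_lines = "message", []
    if data_lines:  # flush a trailing event with no final blank line
        events.append((event_type, "\n".join(data_lines)))
    return events
```

With thinking-mode streaming, one could imagine chain-of-thought deltas arriving as their own event type alongside ordinary token events; check the project's API docs for the actual event vocabulary.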
Pricing
Open Source (Free)
- All TUI features
- 1M-token context window
- Thinking-mode streaming
- MCP client
- LSP diagnostics
Capabilities
Key Features
- Terminal-native TUI coding agent
- 1M-token context window
- Thinking-mode (chain-of-thought) streaming
- Native RLM parallel reasoning with up to 16 sub-agents
- Plan / Agent / YOLO modes
- File ops, shell execution, git, web search/browse tools
- MCP (Model Context Protocol) client
- LSP diagnostics integration
- Session save, resume, and fork
- Workspace rollback via side-git snapshots
- Durable background task queue
- HTTP/SSE runtime API server
- Skills system with GitHub-installable instruction packs
- Live cost tracking with cache hit/miss breakdown
- Localized UI (en, ja, zh-Hans, pt-BR)
- Multiple API provider support (DeepSeek, NVIDIA NIM, Fireworks, SGLang)
- Prefix-cache aware for cost efficiency
- Reasoning-effort tiers (off / high / max)
- User memory for cross-session preferences
