# DeepSeek TUI

> A terminal-native coding agent for DeepSeek models, built in Rust with 1M-token context, thinking-mode streaming, MCP client, sandbox, and durable task queue.

DeepSeek TUI is a terminal-native coding agent built for DeepSeek V4 models, shipping as self-contained Rust binaries with no Node.js or Python runtime required. It gives DeepSeek's frontier models direct access to your workspace — reading and editing files, running shell commands, searching the web, managing git, and orchestrating sub-agents — all through a fast, keyboard-driven TUI. It supports a 1M-token context window and native thinking-mode (chain-of-thought) streaming, and is prefix-cache aware for cost efficiency.

Key features include:

- **Native RLM (`rlm_query`)** — *fans out 1–16 parallel `deepseek-v4-flash` children for batched analysis and parallel reasoning*
- **Thinking-mode streaming** — *watch the model's chain-of-thought unfold in real time as it works through tasks*
- **Full tool suite** — *file ops, shell execution, git, web search/browse, apply-patch, sub-agents, and MCP servers*
- **1M-token context** — *automatic intelligent compaction when context fills up; prefix-cache aware for cost efficiency*
- **Three modes** — *Plan (read-only explore), Agent (interactive with approval), and YOLO (auto-approved)*
- **Reasoning-effort tiers** — *cycle through `off → high → max` with Shift+Tab*
- **Session save/resume** — *checkpoint and resume long-running sessions by UUID or fork at a chosen turn*
- **Workspace rollback** — *side-git pre/post-turn snapshots with `/restore` and `revert_turn` without touching your repo's `.git`*
- **Durable task queue** — *background tasks survive restarts for scheduled automation and long-running reviews*
- **HTTP/SSE runtime API** — *`deepseek serve --http` for headless agent workflows*
- **MCP protocol** — *connect to Model Context Protocol servers for extended tooling*
- **LSP diagnostics** — *inline error/warning surfacing after every edit via rust-analyzer, pyright, typescript-language-server, gopls, and clangd*
- **Skills system** — *composable, installable instruction packs from GitHub with no backend service required*
- **Live cost tracking** — *per-turn and session-level token usage, cost estimates, and cache hit/miss breakdown*
- **Localized UI** — *supports `en`, `ja`, `zh-Hans`, `pt-BR` with auto-detection*
- **Multiple install paths** — *npm, Cargo, Homebrew, Scoop, or direct binary download for Linux x64/ARM64, macOS x64/ARM64, and Windows x64*
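The HTTP/SSE runtime API (`deepseek serve --http`) streams agent events as server-sent events. A minimal client-side sketch of parsing such a stream; the event names (`turn.delta`, `turn.done`) and payload fields here are illustrative assumptions, not a documented schema:

```python
# Minimal SSE (server-sent events) parser sketch for consuming an event
# stream like the one `deepseek serve --http` exposes. Event names below
# (`turn.delta`, `turn.done`) are hypothetical placeholders.
import json


def parse_sse(raw: str):
    """Split a decoded SSE stream into (event_name, data) pairs."""
    events = []
    event_name, data_lines = "message", []
    for line in raw.splitlines():
        if line.startswith("event:"):
            event_name = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "":  # a blank line terminates one event
            if data_lines:
                events.append((event_name, "\n".join(data_lines)))
            event_name, data_lines = "message", []
    return events


# Example stream such as the runtime API might emit (illustrative only).
stream = (
    "event: turn.delta\n"
    'data: {"text": "Reading src/main.rs"}\n'
    "\n"
    "event: turn.done\n"
    'data: {"tokens": 1024}\n'
    "\n"
)

for name, payload in parse_sse(stream):
    print(name, json.loads(payload))
```

In a real headless workflow the same loop would run over the HTTP response body line by line rather than an in-memory string.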

## Features
- Terminal-native TUI coding agent
- 1M-token context window
- Thinking-mode (chain-of-thought) streaming
- Native RLM parallel reasoning with up to 16 sub-agents
- Plan / Agent / YOLO modes
- File ops, shell execution, git, web search/browse tools
- MCP (Model Context Protocol) client
- LSP diagnostics integration
- Session save, resume, and fork
- Workspace rollback via side-git snapshots
- Durable background task queue
- HTTP/SSE runtime API server
- Skills system with GitHub-installable instruction packs
- Live cost tracking with cache hit/miss breakdown
- Localized UI (en, ja, zh-Hans, pt-BR)
- Multiple API provider support (DeepSeek, NVIDIA NIM, Fireworks, SGLang)
- Prefix-cache aware for cost efficiency
- Reasoning-effort tiers (off / high / max)
- User memory for cross-session preferences
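As a rough illustration of what wiring up the MCP client could look like, a hedged config sketch — the file location, table names, and keys below are assumptions, not the tool's documented format:

```toml
# Hypothetical MCP server configuration sketch. The actual schema and
# config file path used by DeepSeek TUI are assumptions here.
[mcp_servers.filesystem]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/home/me/projects"]

[mcp_servers.github]
command = "mcp-server-github"
env = { GITHUB_TOKEN = "ghp_..." }
```

Each entry names a server and the command used to spawn it; the agent would then expose that server's tools alongside its built-in suite.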

## Integrations
DeepSeek API, NVIDIA NIM, Fireworks AI, SGLang (self-hosted), Model Context Protocol (MCP) servers, rust-analyzer, pyright, typescript-language-server, gopls, clangd, npm, Cargo, Homebrew, Scoop, GitHub

## Platforms
Windows, macOS, Linux, API, VS Code extension, CLI

## Pricing
Open Source

## Version
v0.8.14

## Links
- Website: https://github.com/Hmbown/DeepSeek-TUI
- Documentation: https://github.com/Hmbown/DeepSeek-TUI/tree/main/docs
- Repository: https://github.com/Hmbown/DeepSeek-TUI
- EveryDev.ai: https://www.everydev.ai/tools/deepseek-tui
