    DeepSeek TUI

    AI Coding Assistants

    A terminal-native coding agent for DeepSeek models, built in Rust with 1M-token context, thinking-mode streaming, MCP client, sandbox, and durable task queue.


    At a Glance

    Pricing
    Open Source

    Fully free and open-source under the MIT License. Requires your own DeepSeek API key.

    Available On

    Windows
    macOS
    Linux
    API
    VS Code

    Resources

    Website · Docs · GitHub · llms.txt

    Topics

    AI Coding Assistants · Command Line Assistants · MCP Integration

    Alternatives

    Stud · Crush · Beehive

    Developer
    Hmbown

    Listed May 2026

    About DeepSeek TUI

    DeepSeek TUI is a terminal-native coding agent built for DeepSeek V4 models, shipping as self-contained Rust binaries with no Node.js or Python runtime required. It gives DeepSeek's frontier models direct access to your workspace — reading and editing files, running shell commands, searching the web, managing git, and orchestrating sub-agents — all through a fast, keyboard-driven TUI. It supports a 1M-token context window, native thinking-mode (chain-of-thought) streaming, and prefix-cache-aware cost efficiency.

    Key features include:

    • Native RLM (rlm_query) — fans out 1–16 parallel deepseek-v4-flash children for batched analysis and parallel reasoning
    • Thinking-mode streaming — watch the model's chain-of-thought unfold in real time as it works through tasks
    • Full tool suite — file ops, shell execution, git, web search/browse, apply-patch, sub-agents, and MCP servers
    • 1M-token context — automatic intelligent compaction when context fills up; prefix-cache aware for cost efficiency
    • Three modes — Plan (read-only explore), Agent (interactive with approval), and YOLO (auto-approved)
    • Reasoning-effort tiers — cycle through off → high → max with Shift+Tab
    • Session save/resume — checkpoint and resume long-running sessions by UUID or fork at a chosen turn
    • Workspace rollback — side-git pre/post-turn snapshots with /restore and revert_turn without touching your repo's .git
    • Durable task queue — background tasks survive restarts for scheduled automation and long-running reviews
    • HTTP/SSE runtime API — deepseek serve --http for headless agent workflows
    • MCP protocol — connect to Model Context Protocol servers for extended tooling
    • LSP diagnostics — inline error/warning surfacing after every edit via rust-analyzer, pyright, typescript-language-server, gopls, and clangd
    • Skills system — composable, installable instruction packs from GitHub with no backend service required
    • Live cost tracking — per-turn and session-level token usage, cost estimates, and cache hit/miss breakdown
    • Localized UI — supports en, ja, zh-Hans, pt-BR with auto-detection
    • Multiple install paths — npm, Cargo, Homebrew, Scoop, or direct binary download for Linux x64/ARM64, macOS x64/ARM64, and Windows x64
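
    The HTTP/SSE runtime API mentioned above (`deepseek serve --http`) implies that headless clients consume a Server-Sent-Events stream. As a minimal sketch, here is a standard SSE parser in Python; the event names and payload fields shown are hypothetical, not the project's documented schema, so check the real shape against the docs before relying on it:

    ```python
    import json

    def parse_sse(lines):
        """Yield JSON payloads from a Server-Sent-Events text stream.

        Standard SSE framing: one or more `data:` lines per event,
        terminated by a blank line. The payload schema in the example
        below is an assumption, not DeepSeek TUI's documented API.
        """
        buf = []
        for line in lines:
            line = line.rstrip("\n")
            if line.startswith("data:"):
                buf.append(line[len("data:"):].strip())
            elif line == "" and buf:
                yield json.loads("\n".join(buf))
                buf = []

    # Hypothetical thinking-mode stream as `deepseek serve --http` might emit it:
    raw = [
        'data: {"type": "thinking", "text": "Scanning workspace..."}\n',
        "\n",
        'data: {"type": "answer", "text": "Done."}\n',
        "\n",
    ]
    events = list(parse_sse(raw))
    print(events[0]["type"])  # thinking
    ```

    In practice a client would read lines from the HTTP response body of the running server (for example, an HTTP library in streaming mode); the parser itself does not depend on how the lines arrive.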

    Pricing

    Open Source (Free)

    Fully free and open-source under the MIT License. Requires your own DeepSeek API key.

    • All TUI features
    • 1M-token context window
    • Thinking-mode streaming
    • MCP client
    • LSP diagnostics

    Capabilities

    Key Features

    • Terminal-native TUI coding agent
    • 1M-token context window
    • Thinking-mode (chain-of-thought) streaming
    • Native RLM parallel reasoning with up to 16 sub-agents
    • Plan / Agent / YOLO modes
    • File ops, shell execution, git, web search/browse tools
    • MCP (Model Context Protocol) client
    • LSP diagnostics integration
    • Session save, resume, and fork
    • Workspace rollback via side-git snapshots
    • Durable background task queue
    • HTTP/SSE runtime API server
    • Skills system with GitHub-installable instruction packs
    • Live cost tracking with cache hit/miss breakdown
    • Localized UI (en, ja, zh-Hans, pt-BR)
    • Multiple API provider support (DeepSeek, NVIDIA NIM, Fireworks, SGLang)
    • Prefix-cache aware for cost efficiency
    • Reasoning-effort tiers (off / high / max)
    • User memory for cross-session preferences

    Integrations

    DeepSeek API
    NVIDIA NIM
    Fireworks AI
    SGLang (self-hosted)
    Model Context Protocol (MCP) servers
    rust-analyzer
    pyright
    typescript-language-server
    gopls
    clangd
    npm
    Cargo
    Homebrew
    Scoop
    GitHub

    Developer

    Hmbown

    Hmbown builds DeepSeek TUI, a terminal-native coding agent for DeepSeek models written in Rust. The project ships self-contained binaries with no runtime dependencies and supports multiple platforms including Linux, macOS, and Windows. It integrates with the DeepSeek API and compatible providers, offering advanced features like MCP protocol support, LSP diagnostics, and a durable task queue.

    Website · GitHub
    1 tool in directory

    Similar Tools

    Stud

    Open-source AI coding assistant built for Roblox developers with deep Studio integration for scripts, instances, and DataStores.

    Crush

    An AI-powered coding assistant CLI that connects your tools, code, and workflows to your choice of LLM.

    Beehive

    Manage multiple GitHub repos, create isolated git workspaces, and run coding agents side-by-side — all from one desktop or terminal window.


    Related Topics

    AI Coding Assistants

    AI tools that help write, edit, and understand code with intelligent suggestions.

    417 tools

    Command Line Assistants

    AI-powered command-line assistants that help developers navigate, search, and execute terminal commands with intelligent suggestions and context awareness.

    124 tools

    MCP Integration

    Tools for integrating MCP with existing AI systems and applications.

    48 tools
    With AI, Everyone is a Dev. EveryDev.ai © 2026