AgenticSeek
A fully local, private AI agent that autonomously browses the web, writes and runs code, and plans complex tasks — with zero cloud dependency.
At a Glance
Fully free and open-source under GPL-3.0. Self-host on your own hardware with no cost beyond electricity.
Listed May 2026
About AgenticSeek
AgenticSeek is a 100% local alternative to Manus AI, designed to run entirely on your own hardware with no cloud dependency or API costs. It combines autonomous web browsing, code generation, task planning, and voice interaction into a single self-hosted AI assistant. All data stays on your device, ensuring complete privacy. The project is open-source under GPL-3.0 and supports both local LLM providers (Ollama, LM Studio) and optional cloud APIs.
- Fully Local & Private: Everything runs on your machine — no cloud, no data sharing. Files, conversations, and searches never leave your device.
- Smart Web Browsing: AgenticSeek autonomously searches the web, reads pages, extracts information, and fills forms hands-free using a stealth-mode browser.
- Autonomous Coding Assistant: Writes, debugs, and runs code in Python, C, Go, Java, and more without supervision.
- Smart Agent Routing: Automatically selects the best agent for each task — web search, coding, file management, or planning — based on your query.
- Multi-Step Task Planning: Breaks complex goals into steps and executes them using multiple coordinated AI agents.
- Voice-Enabled Interaction: Supports speech-to-text input and text-to-speech output, so you can talk to the assistant like a sci-fi AI (CLI mode only).
- Flexible LLM Support: Works with local providers (Ollama, LM Studio, llama.cpp) and cloud APIs (OpenAI, Google Gemini, DeepSeek, HuggingFace, TogetherAI, OpenRouter, MiniMax).
- Docker-Based Deployment: Runs via Docker Compose with a web interface at localhost:3000, or in CLI mode installed directly on the host.
- Session Memory: Optionally saves and recovers previous sessions for continuity across restarts.
- Self-Hosted LLM Server: Supports offloading LLM inference to a remote machine on your local network for resource-constrained laptops.
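The "smart agent routing" idea above can be illustrated with a minimal keyword-based dispatcher. This is a toy sketch for intuition only, not AgenticSeek's actual router (the project selects agents with its own classification logic); the agent names and keyword sets below are illustrative assumptions.

```python
# Toy illustration of agent routing: score each agent by keyword
# overlap with the query and dispatch to the best match.
# NOT AgenticSeek's real router -- agent names and keywords are made up.
AGENT_KEYWORDS = {
    "browser": {"search", "web", "website", "find", "browse"},
    "coder": {"code", "script", "debug", "python", "compile"},
    "file": {"file", "folder", "rename", "move", "directory"},
    "planner": {"plan", "steps", "organize", "project"},
}

def route(query: str) -> str:
    """Return the name of the agent whose keywords best match the query."""
    words = set(query.lower().split())
    scores = {agent: len(words & kws) for agent, kws in AGENT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    # Fall back to the planner when nothing matches at all.
    return best if scores[best] > 0 else "planner"
```

A real router would classify with an LLM rather than keywords, but the shape is the same: inspect the query, pick one specialized agent, and let the planner coordinate anything multi-step.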
Pricing
Open Source (GPL-3.0)
- Fully local execution
- Autonomous web browsing
- Code generation and execution
- Multi-agent task planning
- Voice input/output
Capabilities
Key Features
- Fully local and private execution
- Autonomous web browsing and form filling
- Code writing, debugging, and execution
- Smart agent routing
- Multi-step task planning
- Voice input (speech-to-text) and output (text-to-speech)
- Support for Ollama, LM Studio, llama.cpp
- Optional cloud API support (OpenAI, Gemini, DeepSeek, etc.)
- Docker-based deployment with web UI
- CLI mode
- Session save and recovery
- Self-hosted remote LLM server support
- Stealth browser mode
- Multi-language support
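Since Ollama is one of the supported local providers, here is a minimal sketch of what talking to a local model looks like. The endpoint and payload shape follow Ollama's public `/api/generate` HTTP API (default port 11434); the model name `llama3` is just an example, and `ask_local_llm` is a hypothetical helper, not part of AgenticSeek's codebase.

```python
import json
import urllib.request

def build_generate_request(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.
    stream=False asks for a single JSON object instead of a stream."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_llm(prompt: str, model: str = "llama3",
                  host: str = "http://localhost:11434") -> str:
    """Send a prompt to a locally running Ollama server and return the text.
    Requires an Ollama instance listening on `host`."""
    data = json.dumps(build_generate_request(prompt, model)).encode()
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Pointing `host` at another machine on your LAN is essentially what the self-hosted remote LLM server option does: the agent logic runs on a light laptop while inference happens on a beefier box.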
