# Atomic Chat

> Open-source ChatGPT alternative that lets you run local LLMs or connect to cloud AI models with full privacy and control.

Atomic Chat is an open-source desktop application that serves as a privacy-first alternative to ChatGPT, built with Tauri and TypeScript. It lets you download and run large language models locally on your machine or connect to cloud-based AI providers such as OpenAI, Anthropic, and Mistral. All local inference runs entirely offline, giving you complete control over your data. The app also exposes an OpenAI-compatible local API server and supports the Model Context Protocol (MCP) for agentic workflows.

- **Local AI Models** — *Download and run LLMs (Llama, Gemma, Qwen, and more) directly from HuggingFace without any cloud dependency.*
- **Cloud Integration** — *Connect to OpenAI, Anthropic, Mistral, Groq, MiniMax, and other cloud providers from a single interface.*
- **Custom Assistants** — *Create specialized AI assistants tailored to specific tasks or workflows.*
- **OpenAI-Compatible API** — *A local server runs at `localhost:1337`, making Atomic Chat a drop-in backend for other applications (see the usage sketch at the end of this page).*
- **Model Context Protocol (MCP)** — *Built-in MCP integration enables agentic capabilities and tool use.*
- **Privacy First** — *All processing happens locally when using local models; no data leaves your machine.*
- **Built with Tauri** — *Lightweight, cross-platform desktop app on Rust and web technologies for a native feel.*
- **Open Source (Apache 2.0)** — *Full source code available on GitHub; contributions and forks are welcome.*

To get started, download the macOS `.dmg` from the GitHub Releases page or the official website, install the app, and either pull a model from HuggingFace or enter an API key for a cloud provider. Developers can build from source using Node.js ≥ 20, Yarn ≥ 4.5.3, and Rust (for Tauri) by running `make dev` after cloning the repository.

## Features

- Run local LLMs offline
- Cloud AI provider integration (OpenAI, Anthropic, Mistral, Groq, MiniMax)
- Custom AI assistants
- OpenAI-compatible local API at `localhost:1337`
- Model Context Protocol (MCP) support
- Privacy-first local inference
- HuggingFace model downloads
- Built with Tauri for a native desktop experience
- Open source under Apache 2.0

## Integrations

OpenAI, Anthropic, Mistral, Groq, MiniMax, HuggingFace, Llama.cpp, Tauri, Model Context Protocol (MCP)

## Platforms

Windows, macOS, API, CLI

## Pricing

Open Source

## Version

v1.1.29

## Links

- Website: https://atomic.chat
- Documentation: https://github.com/AtomicBot-ai/Atomic-Chat
- Repository: https://github.com/AtomicBot-ai/Atomic-Chat
- EveryDev.ai: https://www.everydev.ai/tools/atomic-chat
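
## Example: Calling the Local API

Because the local server is OpenAI-compatible, any OpenAI client should be able to target it by overriding the base URL. Below is a minimal sketch in TypeScript using the official `openai` npm package, assuming the server is running at `localhost:1337`, that it exposes the usual `/v1` path prefix, and that a local model has already been downloaded. The model ID `llama3.2-3b-instruct` is a placeholder; substitute whatever identifier appears in your local model list.

```ts
import OpenAI from "openai";

// Point the standard OpenAI client at Atomic Chat's local server.
// The apiKey value is a placeholder: local inference should not need
// a real key, but the client constructor requires one to be set.
const client = new OpenAI({
  baseURL: "http://localhost:1337/v1",
  apiKey: "not-needed-locally",
});

async function main() {
  const completion = await client.chat.completions.create({
    // Hypothetical model ID; check the models you have downloaded.
    model: "llama3.2-3b-instruct",
    messages: [
      { role: "user", content: "Summarize the Model Context Protocol in one sentence." },
    ],
  });
  console.log(completion.choices[0].message.content);
}

main();
```

Since the endpoint mirrors the OpenAI REST surface, the same pattern should carry over to other OpenAI-compatible SDKs and tools; only the base URL changes.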