# Pieces

> AI-powered desktop app for developers that captures workflow context, builds on-device long-term memory, and integrates with IDEs, browsers, CLIs, and local LLMs for context-aware coding.

Pieces is an AI-powered productivity platform for developers that captures and resurfaces workflow context so you never lose track of what you were working on. It provides a desktop application with on-device long-term memory, local LLM support, and integrations across IDEs, browsers, and command-line tools. Pieces also exposes a Model Context Protocol (MCP) server to connect external AI tools to its shared context engine.

- **Desktop app** — The flagship app captures workflow events, preserves context, and surfaces relevant snippets during development. Available for macOS, Windows, and Linux.
- **On-device long-term memory** — Structured, privacy-preserving memory stored locally, so your context and chat history remain on your machine.
- **Model Context Protocol (MCP)** — Connect external tools and models to the Pieces memory engine for shared contextual intelligence.
- **IDE & editor plugins** — Extensions for VS Code, JetBrains IDEs, Visual Studio, Sublime Text, Neovim, and JupyterLab bring context-aware code search and generation inside your editor.
- **Browser & productivity integrations** — Browser extensions and integrations for Obsidian, Raycast, and the Pieces CLI extend context capture across your workflow.
- **Local LLM support** — Integrates with on-device models, such as those served by Ollama, for private, performant inference without sending data to the cloud.

To get started, download the desktop app, install the relevant plugins, and enable local model or MCP integrations as needed (a hedged MCP connection sketch appears at the end of this page).

## Features

- Desktop app with AI-powered code management, search, and generation
- On-device long-term memory engine for contextual recall
- Model Context Protocol (MCP) for external tool integrations
- Official IDE plugins (VS Code, JetBrains, Visual Studio, Sublime Text, Neovim, JupyterLab)
- Browser extensions for major browsers
- CLI and productivity tool integrations (Raycast, Obsidian)
- Local LLM support via Ollama and other on-device models

## Integrations

GitHub Copilot, Cursor, Goose, Claude Desktop, VS Code, JetBrains IDEs, Visual Studio, Sublime Text, Neovim, JupyterLab, Obsidian, Raycast, Ollama (local LLM support)

## Platforms

Windows, macOS, Linux

## Pricing

Freemium — free tier available with paid upgrades

## Links

- Website: https://pieces.app
- Documentation: https://docs.pieces.app/
- Repository: https://github.com/pieces-app/support
- EveryDev.ai: https://www.everydev.ai/tools/pieces
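As a rough illustration of the MCP integration mentioned above, the sketch below connects a Python MCP client to a locally running Pieces MCP server and lists the tools it exposes. This is a minimal sketch, not an official recipe: it assumes PiecesOS is running and serving MCP over SSE, the endpoint URL is a placeholder (check docs.pieces.app for the actual address and port), and it uses the `mcp` Python SDK (`pip install mcp`). MCP-capable clients such as Cursor, Claude Desktop, or Goose connect through the same protocol via their own configuration.

```python
"""Minimal sketch: query the Pieces MCP server with the `mcp` Python SDK.

Assumptions (not stated on this page): PiecesOS is running locally and
serves MCP over SSE; the endpoint below is a placeholder -- look up the
real URL and port in the Pieces desktop app or at docs.pieces.app.
"""
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

# Placeholder endpoint; replace with the address shown by your Pieces install.
PIECES_MCP_URL = "http://localhost:39300/model_context_protocol/2024-11-05/sse"


async def main() -> None:
    # Open an SSE transport to the locally running Pieces MCP server.
    async with sse_client(PIECES_MCP_URL) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            # List whatever tools Pieces exposes (e.g. long-term-memory queries).
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")


if __name__ == "__main__":
    asyncio.run(main())
```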