# OpenWarp

> Community fork of the Warp terminal that opens up the AI layer with Bring Your Own Provider (BYOP) for any OpenAI-compatible endpoint.

OpenWarp is a community fork of the Warp terminal maintained by zerx-lab that opens up Warp's AI layer. It keeps the full Warp terminal experience — blocks, workflows, keymaps — while letting users plug in any OpenAI Chat Completions-compatible provider, customize system prompts with minijinja templates, and keep every credential local.

The official Warp client routes AI requests through Warp's cloud agent service. OpenWarp opens that layer entirely, sending requests straight from the client to the configured provider so users own the stack end-to-end.

- **Bring Your Own Provider (BYOP)** — *Paste a Base URL and API key in settings. Any OpenAI Chat Completions-compatible endpoint works out of the box, including OpenAI, DeepSeek, Qwen, Groq, Together, local Ollama / LM Studio, and most proxy gateways.*
- **Local credential storage** — *API keys and authentication tokens never leave the machine. They are stored locally in the configuration file, and requests go directly to the configured provider with no relay.*
- **Native multi-protocol support** — *Six API protocols supported through the genai adapter, implemented natively rather than as a compatibility shim, preserving protocol-specific features such as DeepSeek's `reasoning_content` and Claude's extended thinking.*
- **Dynamic prompt templates** — *A minijinja template engine renders the system prompt in real time based on the current working directory, locale, and role.*
- **First-class internationalization** — *Ships with English and Chinese locales, with community-extensible locale support.*
- **Upstream sync** — *Continuously merges from upstream Warp while layering BYOP and i18n on top.*
- **Familiar Warp UX** — *One-click switching of models and conversations, plus inline command suggestions. The terminal experience is otherwise identical to Warp.*
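Because any Chat Completions-compatible endpoint works, the wire format is the standard OpenAI one. A minimal Python sketch of the request shape, using a hypothetical local Ollama base URL and a placeholder key (OpenWarp itself is written in Rust; this only illustrates what a BYOP request looks like on the wire):

```python
import json
from urllib import request

# Hypothetical values -- substitute whatever Base URL and API key you
# enter in OpenWarp's settings. Any Chat Completions-compatible host
# (OpenAI, DeepSeek, Groq, a local Ollama server, ...) works the same way.
BASE_URL = "http://localhost:11434/v1"  # e.g. local Ollama
API_KEY = "sk-local-placeholder"

def build_chat_request(model: str, user_message: str) -> request.Request:
    """Build a standard OpenAI Chat Completions POST request."""
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a terminal assistant."},
            {"role": "user", "content": user_message},
        ],
    }
    return request.Request(
        url=f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("llama3", "List files modified today")
```

The key remains in the local configuration and travels only in the `Authorization` header of this direct request; there is no intermediate relay.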

OpenWarp is built in Rust and distributed as source, with installers for macOS and Windows. It is dual-licensed under AGPL-3.0 and MIT (matching upstream Warp: warpui_core / warpui crates under MIT, everything else under AGPL-3.0). It is not affiliated with Warp Inc.

## Features
- Bring Your Own Provider (BYOP) for any OpenAI-compatible endpoint
- 100% local credential storage with no cloud relay
- Native support for six API protocols via the genai adapter
- Minijinja-powered dynamic system prompt templates
- Context-aware prompts based on cwd, locale, and role
- Continuous upstream sync with Warp
- First-class English and Chinese locales
- Built in Rust with cargo build workflow
- macOS and Windows installers
- AGPL-3.0 / MIT dual license
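The context-aware prompting above can be pictured with a short minijinja template. The variable names (`cwd`, `locale`, `role`) mirror the context values listed in this document, but the exact names and structure OpenWarp exposes are an assumption, not its actual template:

```jinja
{# Hypothetical system-prompt template; variable names are illustrative. #}
You are a terminal assistant{% if role %} acting as {{ role }}{% endif %}.
The user's current working directory is {{ cwd }}.
{% if locale == "zh" %}
Respond in Chinese.
{% else %}
Respond in English.
{% endif %}
```

minijinja implements a Jinja2-compatible syntax in Rust, so templates like this can be rendered on every request with fresh `cwd`, `locale`, and `role` values.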

## Integrations
OpenAI, DeepSeek, Qwen, Groq, Together AI, Ollama, LM Studio, Anthropic Claude

## Platforms
macOS, Windows, Linux

## Pricing
Open Source

## Links
- Website: https://openwarp.zerx.dev/en
- Repository: https://github.com/zerx-lab/warp/tree/openWarp
- EveryDev.ai: https://www.everydev.ai/tools/openwarp
