# OpenClaude

> A fork of Claude Code that adds an OpenAI-compatible provider shim, letting you use GPT-4o, DeepSeek, Gemini, Llama, Mistral, or any OpenAI chat completions-compatible model.

OpenClaude is a fork of the leaked Claude Code source that replaces Anthropic's proprietary LLM backend with an OpenAI-compatible provider shim. It lets developers use all of Claude Code's agentic tools — bash, file read/write/edit, grep, glob, MCP, agents, tasks — powered by any model that speaks the OpenAI chat completions API. Supported providers include OpenAI (GPT-4o), DeepSeek, Google Gemini (via OpenRouter), Llama, Mistral, Azure OpenAI, Groq, Together AI, Ollama, LM Studio, and the ChatGPT Codex backend. The project is written in TypeScript and is installable via npm or buildable from source with Bun.

- **OpenAI-compatible shim** — *Set three environment variables (`CLAUDE_CODE_USE_OPENAI=1`, `OPENAI_API_KEY`, `OPENAI_MODEL`) and run `openclaude` to get started with any supported provider.*
- **Full tool support** — *All Claude Code tools work out of the box: Bash, FileRead, FileWrite, FileEdit, Glob, Grep, WebFetch, WebSearch, Agent, MCP, LSP, NotebookEdit, and Tasks.*
- **Streaming & multi-step tool chains** — *Real-time token streaming and multi-step tool calling are fully supported across all compatible models.*
- **Local model support** — *Run entirely offline with Ollama or LM Studio; no API key is required for local providers.*
- **Codex backend support** — *The `codexplan` and `codexspark` aliases map to ChatGPT Codex backend models; `~/.codex/auth.json` is read automatically if present.*
- **Provider launch profiles** — *Use `bun run profile:init` to bootstrap a provider profile and `bun run dev:profile` to launch with persisted settings, avoiding repeated environment setup.*
- **Runtime hardening tools** — *Built-in `doctor:runtime`, `smoke`, and `hardening:check` commands validate provider configuration and environment before launch.*
- **Vision & image support** — *Base64 and URL images are passed through to vision-capable models.*
- **Sub-agents & memory** — *AgentTool spawns sub-agents using the same configured provider; a persistent memory system is included.*
- **Slash commands** — */commit, /review, /compact, /diff, /doctor, and more are all available.*

## Features

- OpenAI-compatible provider shim
- Support for GPT-4o, DeepSeek, Gemini, Llama, Mistral, and 200+ models
- Full Claude Code tool support (Bash, FileRead, FileWrite, FileEdit, Glob, Grep, WebFetch, WebSearch, Agent, MCP, LSP, NotebookEdit, Tasks)
- Real-time token streaming
- Multi-step tool chains
- Local model support via Ollama and LM Studio
- ChatGPT Codex backend support (codexplan, codexspark)
- Provider launch profiles
- Runtime hardening and diagnostics (doctor:runtime, smoke, hardening:check)
- Vision and image support (Base64 and URL)
- Sub-agents via AgentTool
- Persistent memory system
- Slash commands (/commit, /review, /compact, /diff, /doctor)
- npm and Bun install options
- Azure OpenAI support

## Integrations

OpenAI, DeepSeek, Google Gemini, Ollama, LM Studio, Together AI, Groq, Mistral, Azure OpenAI, OpenRouter, ChatGPT Codex, MCP, LSP

## Platforms

Windows, API, CLI

## Pricing

Open Source

## Links

- Website: https://github.com/Gitlawb/openclaude
- Documentation: https://github.com/Gitlawb/openclaude
- Repository: https://github.com/Gitlawb/openclaude
- EveryDev.ai: https://www.everydev.ai/tools/openclaude
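## Quick start

A minimal setup sketch using the three environment variables described above. The API key and model name below are placeholders — substitute your own provider credentials; the two `bun run` lines are the optional profile-based alternative to exporting variables each session.

```shell
# One-off launch: point the shim at an OpenAI-compatible provider.
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_API_KEY="sk-your-key-here"   # placeholder — use your provider's key
export OPENAI_MODEL="gpt-4o"               # any chat-completions-compatible model
openclaude

# Or persist the configuration as a launch profile instead:
bun run profile:init    # bootstrap a provider profile
bun run dev:profile     # launch with the persisted settings
```

Before the first real run, `bun run doctor:runtime` (or `smoke` / `hardening:check`) can be used to validate the provider configuration and environment.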