OpenClaude
A fork of Claude Code that adds an OpenAI-compatible provider shim, letting you use GPT-4o, DeepSeek, Gemini, Llama, Mistral, or any other model that speaks the OpenAI chat completions API.
At a Glance
Fully free and open-source; install via npm or build from source.
Listed Apr 2026
About OpenClaude
OpenClaude is a fork of the leaked Claude Code source that replaces Anthropic's proprietary LLM backend with an OpenAI-compatible provider shim. It lets developers use all of Claude Code's powerful agentic tools — bash, file read/write/edit, grep, glob, MCP, agents, tasks — powered by any model that speaks the OpenAI chat completions API. Supported providers include OpenAI (GPT-4o), DeepSeek, Google Gemini (via OpenRouter), Llama, Mistral, Azure OpenAI, Groq, Together AI, Ollama, LM Studio, and the ChatGPT Codex backend. The project is written in TypeScript and installable via npm or buildable from source using Bun.
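Concretely, "speaks the OpenAI chat completions API" means the backend accepts the standard `POST /v1/chat/completions` payload. A minimal sketch of that request shape follows; the `buildChatRequest` helper is illustrative, not part of OpenClaude's actual code:

```typescript
// Sketch of the OpenAI chat completions request shape the shim targets.
// Field names (model, messages, stream, role, content) are the standard
// OpenAI API fields; this helper itself is purely illustrative.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(model: string, messages: ChatMessage[], stream = false) {
  return {
    url: "/v1/chat/completions", // same path on OpenAI, Ollama, Groq, etc.
    body: { model, messages, stream },
  };
}

// Any provider that accepts this payload can back OpenClaude's tools.
const req = buildChatRequest("gpt-4o", [{ role: "user", content: "List files here." }]);
console.log(req.body.model); // "gpt-4o"
```

Because every listed provider exposes this same shape, swapping models is a configuration change rather than a code change.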
- OpenAI-compatible shim — Set three environment variables (`CLAUDE_CODE_USE_OPENAI=1`, `OPENAI_API_KEY`, `OPENAI_MODEL`) and run `openclaude` to get started with any supported provider.
- Full tool support — All Claude Code tools work out of the box: Bash, FileRead, FileWrite, FileEdit, Glob, Grep, WebFetch, WebSearch, Agent, MCP, LSP, NotebookEdit, and Tasks.
- Streaming & multi-step tool chains — Real-time token streaming and multi-step tool calling are fully supported across all compatible models.
- Local model support — Run entirely offline with Ollama or LM Studio; no API key required for local providers.
- Codex backend support — The `codexplan` and `codexspark` aliases map to ChatGPT Codex backend models; `~/.codex/auth.json` is read automatically if present.
- Provider launch profiles — Use `bun run profile:init` to bootstrap a provider profile and `bun run dev:profile` to launch with persisted settings, avoiding repeated environment setup.
- Runtime hardening tools — Built-in `doctor:runtime`, `smoke`, and `hardening:check` commands validate provider configuration and environment before launch.
- Vision & image support — Base64 and URL images are passed through to vision-capable models.
- Sub-agents & memory — AgentTool spawns sub-agents using the same configured provider; a persistent memory system is included.
- Slash commands — /commit, /review, /compact, /diff, /doctor, and more are all available.
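The three-variable setup described above looks like this in practice; the key value is a placeholder, and the launch command is left commented so the snippet only configures the environment:

```shell
# Minimal provider setup using the three variables named above.
export CLAUDE_CODE_USE_OPENAI=1        # route Claude Code through the shim
export OPENAI_API_KEY="sk-placeholder" # your provider's real key goes here
export OPENAI_MODEL="gpt-4o"           # any OpenAI-compatible model ID
# openclaude                           # then launch the CLI
```

Pointing at a different provider is just a matter of changing the key and model values.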
Pricing
Open Source
Fully free and open-source; install via npm or build from source.
- All Claude Code tools
- OpenAI-compatible provider shim
- Local model support
- Provider launch profiles
- Runtime hardening tools
Capabilities
Key Features
- OpenAI-compatible provider shim
- Support for GPT-4o, DeepSeek, Gemini, Llama, Mistral, and 200+ models
- Full Claude Code tool support (Bash, FileRead, FileWrite, FileEdit, Glob, Grep, WebFetch, WebSearch, Agent, MCP, LSP, NotebookEdit, Tasks)
- Real-time token streaming
- Multi-step tool chains
- Local model support via Ollama and LM Studio
- ChatGPT Codex backend support (`codexplan`, `codexspark`)
- Provider launch profiles
- Runtime hardening and diagnostics (`doctor:runtime`, `smoke`, `hardening:check`)
- Vision and image support (Base64 and URL)
- Sub-agents via AgentTool
- Persistent memory system
- Slash commands (/commit, /review, /compact, /diff, /doctor)
- npm and Bun install options
- Azure OpenAI support
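For the local-model path listed above, here is a hedged sketch assuming Ollama's standard CLI and its OpenAI-compatible endpoint on localhost:11434; how OpenClaude is pointed at that endpoint may vary per build, so the pull and launch commands are left commented:

```shell
# Offline setup via Ollama. No real API key is needed, though many
# OpenAI-compatible clients still expect a non-empty key value.
# ollama pull llama3                 # fetch a local model first
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_API_KEY="ollama"       # dummy value; local providers ignore it
export OPENAI_MODEL="llama3"
# openclaude                         # launch against http://localhost:11434/v1
```

LM Studio works the same way: it also exposes an OpenAI-compatible server, just on a different default port.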
