
OpenClaude

Claude Code opened to any LLM — OpenAI, Gemini, DeepSeek, Ollama, and 200+ models via OpenAI-compatible API

Introduction

OpenClaude: Unleash the Power of Any LLM with Claude Code

OpenClaude is a project that extends Claude Code, a powerful AI-powered coding assistant, by enabling it to work with a wide range of Large Language Models (LLMs). Originally forked from a leaked Claude Code source snapshot (exposed via npm source maps on March 31, 2026), OpenClaude introduces an OpenAI-compatible provider shim, allowing users to run models like GPT-4o, DeepSeek, Gemini, Llama, Mistral, and over 200 other models that adhere to the OpenAI chat completions API. This approach makes advanced AI coding assistance accessible beyond the confines of a single proprietary model.

Core Functionality and Features:

OpenClaude retains and enhances the core functionalities of Claude Code, ensuring that users benefit from a robust suite of AI-powered development tools. These include:

  • Universal LLM Compatibility: The primary innovation is the OpenAI-compatible provider shim, which acts as a universal translator between Claude Code's internal SDK interface and various LLM APIs. This means users can switch between different LLMs without altering their workflow.
  • All Claude Code Tools: OpenClaude fully supports all the tools originally developed for Claude Code, including:
    • Bash: Execute shell commands directly.
    • File Operations: Read, write, and edit files within the project context.
    • Glob and Grep: Powerful file searching and pattern matching capabilities.
    • Web Fetch and Search: Integrate with web APIs for data retrieval and information gathering.
    • Agent and MCP: Advanced agentic capabilities for complex task execution and management.
    • LSP (Language Server Protocol): Enhanced code intelligence and autocompletion.
    • NotebookEdit: Seamless integration with notebook environments.
    • Tasks: Manage and execute project-specific tasks.
  • Streaming Support: Real-time token streaming provides immediate feedback, mimicking the interactive experience of direct LLM interaction.
  • Tool Calling: OpenClaude supports multi-step tool chains, allowing the LLM to intelligently call tools, process their results, and continue its reasoning process for complex problem-solving.
  • Image Integration: The system can handle Base64 encoded and URL-based images, passing them to vision-capable models for analysis and processing.
  • Slash Commands: Familiar slash commands like /commit, /review, /compact, /diff, and /doctor are fully functional, streamlining common development actions.
  • Sub-Agents: The AgentTool can spawn sub-agents that utilize the same LLM provider, enabling hierarchical task delegation and parallel processing.
  • Persistent Memory: A robust memory system ensures context is maintained across interactions, improving the coherence and effectiveness of AI assistance.
  • Codex Backend Support: For users familiar with the Codex CLI, OpenClaude supports the ChatGPT Codex backend, specifically for codexplan (high reasoning) and codexspark (faster loops) functionalities.
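The provider shim described above can be pictured concretely. The snippet below is a minimal sketch, assuming the standard OpenAI chat-completions request format, of the kind of payload such a shim emits when it advertises a Bash-style tool to the model; the prompt text and tool schema are illustrative, not taken from OpenClaude's source:

```shell
# Hypothetical payload an OpenAI-compatible shim would send: a user message
# plus a function-style tool definition, in OpenAI chat-completions format.
payload=$(cat <<'EOF'
{
  "model": "gpt-4o",
  "stream": true,
  "messages": [
    {"role": "user", "content": "List the files in the repo root"}
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "bash",
        "description": "Execute a shell command",
        "parameters": {
          "type": "object",
          "properties": {"command": {"type": "string"}},
          "required": ["command"]
        }
      }
    }
  ]
}
EOF
)
echo "$payload"
```

When the model replies with a `tool_calls` entry, the shim maps it to the matching Claude Code tool, executes it, and feeds the result back as a `tool`-role message in the next request, which is what makes the multi-step tool chains possible.
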

Installation and Setup:

OpenClaude offers flexible installation options to suit different user preferences:

Option A: npm (Recommended)

npm install -g @gitlawb/openclaude

Option B: From Source (Requires Bun)

# Clone from gitlawb
git clone https://node.gitlawb.com/z6MkqDnb7Siv3Cwj7pGJq4T5EsUisECqR8KpnDLwcaZq5TPr/openclaude.git
cd openclaude
 
# Install dependencies
bun install
 
# Build
bun run build
 
# Link globally (optional)
npm link

Option C: Run directly with Bun (no build step)

git clone https://node.gitlawb.com/z6MkqDnb7Siv3Cwj7pGJq4T5EsUisECqR8KpnDLwcaZq5TPr/openclaude.git
cd openclaude
bun install
bun run dev

Environment Variables: To configure OpenClaude, set the following environment variables:

  • CLAUDE_CODE_USE_OPENAI=1: Enables the OpenAI provider.
  • OPENAI_API_KEY: Your API key (required for non-local models).
  • OPENAI_MODEL: The specific model to use (e.g., gpt-4o, deepseek-chat, llama3.3:70b).
  • OPENAI_BASE_URL: The API endpoint (defaults to OpenAI's URL, but can be set for custom endpoints like Ollama or OpenRouter).
  • CODEX_API_KEY / CODEX_AUTH_JSON_PATH / CODEX_HOME: For Codex backend integration.
  • OPENCLAUDE_DISABLE_CO_AUTHORED_BY=1: To suppress the default co-authored-by trailer in commit messages.
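
As an example, the variables above can be combined to point OpenClaude at a local Ollama server. The model name below is an assumption (use any model you have pulled); the URL is Ollama's default OpenAI-compatible endpoint:

```shell
# Point the OpenAI provider shim at a local Ollama server.
# Ollama serves an OpenAI-compatible API under /v1 on port 11434 by default.
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_MODEL=llama3.3:70b                  # any locally pulled model
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_API_KEY=ollama                      # placeholder; local servers ignore it
```
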

Provider Launch Profiles: OpenClaude simplifies setup with provider-specific launch profiles:

  • Initialization: bun run profile:init (auto-selects best local/OpenAI), bun run profile:codex, bun run profile:init -- --provider ollama --model llama3.1:8b.
  • Running: bun run dev:profile, bun run dev:openai, bun run dev:ollama, bun run dev:codex.

Model Quality and Recommendations:

OpenClaude's performance is highly dependent on the chosen LLM's capabilities, particularly its tool-calling proficiency. The project provides a guide to model quality:

  • Excellent Tool Calling & Code Quality: GPT-4o, DeepSeek-V3, Gemini 2.0 Flash.
  • Good Tool Calling & Code Quality: Llama 3.3 70B, Mistral Large, GPT-4o-mini, Qwen 2.5 72B.
  • Limited Tool Calling & Code Quality: Smaller models (<7B).

For optimal results, models with strong function/tool calling support are recommended. Local model servers such as Ollama and LM Studio are supported, with automatic detection and configuration options.

Runtime Hardening:

To ensure stability and catch configuration errors early, OpenClaude includes runtime hardening checks:

  • bun run smoke: Basic startup check.
  • bun run doctor:runtime: Validates environment variables and provider connectivity.
  • bun run doctor:report: Generates a diagnostics report.
  • bun run hardening:check / hardening:strict: Comprehensive local and project-wide checks.

Origin and Licensing:

OpenClaude is a fork of the instructkr/claude-code repository, which was derived from a Claude Code source snapshot. The original Claude Code is the property of Anthropic. OpenClaude's OpenAI shim additions are in the public domain, provided for educational and research purposes. It is not affiliated with or endorsed by Anthropic.

Key Differentiators:
  • Flexibility: Use any OpenAI-compatible LLM, not just Anthropic's models.
  • Power: Retains all the advanced tools and features of the original Claude Code.
  • Accessibility: Makes sophisticated AI coding assistance available to a wider range of users and models.
  • Customization: Extensive environment variable support and profile management for tailored setups.

OpenClaude represents a significant step forward in making AI-powered coding tools more versatile and accessible, empowering developers to choose the best LLM for their specific needs and workflows.
