
feat: add Claude Code CLI as LLM backend#460

Open
themoddedcube wants to merge 2 commits into algorithmicsuperintelligence:main from themoddedcube:feat/claude-code-llm

Conversation

@themoddedcube

Summary

  • Adds ClaudeCodeLLM, a new LLM provider that uses the Claude Code CLI (claude -p) as a subprocess for generation
  • No API keys needed — authentication uses the CLI's existing OAuth session (claude login)
  • Adds a provider field to LLMModelConfig and a provider registry in LLMEnsemble for automatic backend selection
  • Includes 9 unit tests covering initialization, CLI invocation, system message passing, error handling, and provider registry integration
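The core of the provider is a thin subprocess wrapper around the CLI's non-interactive mode. A minimal sketch of that invocation (function names and error handling here are illustrative, not the PR's exact code — only the `claude -p` call is taken from the description):

```python
import subprocess

def build_claude_argv(prompt: str) -> list[str]:
    # Non-interactive "print" mode: `claude -p <prompt>` emits the
    # model's response on stdout.
    return ["claude", "-p", prompt]

def run_claude(prompt: str, timeout: int = 300) -> str:
    # Run the CLI as a subprocess; authentication comes from the
    # existing `claude login` OAuth session, so no API key is passed.
    result = subprocess.run(
        build_claude_argv(prompt),
        capture_output=True,
        text=True,
        timeout=timeout,
    )
    if result.returncode != 0:
        raise RuntimeError(f"claude CLI failed: {result.stderr.strip()}")
    return result.stdout.strip()
```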

Usage

Config-based (no code changes needed):

```yaml
llm:
  provider: "claude_code"
  models:
    - name: "sonnet"
      weight: 1.0
      max_tokens: 16000
      timeout: 300
```

Programmatic (for custom setups):

```python
from openevolve.llm.claude_code import init_claude_code_client

for model_cfg in config.llm.models:
    model_cfg.init_client = init_claude_code_client
```

Files changed

| File | Change |
|------|--------|
| openevolve/llm/claude_code.py | New ClaudeCodeLLM class + init_claude_code_client factory |
| openevolve/llm/__init__.py | Export new classes |
| openevolve/llm/ensemble.py | Add provider registry + _create_model() dispatch |
| openevolve/config.py | Add provider field to LLMModelConfig |
| tests/test_claude_code_llm.py | 9 unit tests |
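The provider registry in ensemble.py maps the new `provider` config field to a backend factory. A minimal sketch of that dispatch pattern (registry and factory names are illustrative; the PR's actual implementation may differ):

```python
from typing import Callable, Dict

# Registry mapping provider name -> factory callable.
PROVIDER_REGISTRY: Dict[str, Callable] = {}

def register_provider(name: str):
    """Decorator that registers a backend factory under a provider name."""
    def decorator(factory):
        PROVIDER_REGISTRY[name] = factory
        return factory
    return decorator

@register_provider("claude_code")
def make_claude_code(model_cfg):
    # Would construct a ClaudeCodeLLM; stubbed as a string for illustration.
    return f"ClaudeCodeLLM({model_cfg['name']})"

@register_provider("openai")
def make_openai(model_cfg):
    return f"OpenAILLM({model_cfg['name']})"

def create_model(model_cfg):
    # Dispatch on the config's provider field, defaulting to "openai".
    provider = model_cfg.get("provider", "openai")
    try:
        factory = PROVIDER_REGISTRY[provider]
    except KeyError:
        raise ValueError(f"Unknown LLM provider: {provider}")
    return factory(model_cfg)
```

Registering via a table like this keeps `_create_model()` free of per-backend if/else chains and lets new providers plug in without touching the ensemble.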

Test plan

  • All 9 new unit tests pass
  • All 376 existing tests still pass (the only failures are pre-existing and unrelated: a missing requests module, and integration tests that require API keys)
  • Manual verification with claude CLI installed and authenticated
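The unit tests can exercise the CLI path without invoking the real `claude` binary by mocking subprocess. A sketch of that approach (the helper and test names are illustrative, not the PR's actual tests):

```python
import subprocess
from unittest import mock

def call_claude(prompt: str) -> str:
    # Thin wrapper around `claude -p`, as described in the summary.
    result = subprocess.run(
        ["claude", "-p", prompt], capture_output=True, text=True
    )
    return result.stdout.strip()

def test_call_claude_invokes_cli():
    # Fake a successful CLI run so the test needs no installed binary.
    fake = subprocess.CompletedProcess(
        args=["claude", "-p", "hi"], returncode=0, stdout="hello\n", stderr=""
    )
    with mock.patch("subprocess.run", return_value=fake) as run:
        assert call_claude("hi") == "hello"
        run.assert_called_once()
```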

🤖 Generated with Claude Code

Adds ClaudeCodeLLM, a new LLM provider that uses the Claude Code CLI
(`claude -p`) for generation. This enables OpenEvolve to use Anthropic's
Claude models without requiring direct API keys — authentication is
handled by the CLI's existing OAuth session.

Users can configure it via `provider: "claude_code"` in config.yaml or
by injecting `init_claude_code_client` programmatically. Includes a
provider registry in LLMEnsemble for automatic backend selection.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.

…laude Code backend

Add retry/retry_delay support to ClaudeCodeLLM mirroring the OpenAI
backend pattern. Expose max_budget_usd in LLMModelConfig so it can be
set via YAML. Add claude_code_quickstart example and document the
Claude Code CLI provider in the README.
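The retry/retry_delay behavior described above can be sketched as a fixed-delay loop around generation (this mirrors the stated "OpenAI backend pattern" in spirit; parameter names and the exact backoff in the PR may differ):

```python
import time

def generate_with_retry(generate, prompt, retries=3, retry_delay=1.0,
                        sleep=time.sleep):
    """Call generate(prompt), retrying up to `retries` times on failure,
    sleeping `retry_delay` seconds between attempts. `sleep` is injectable
    so tests can avoid real delays."""
    last_err = None
    for attempt in range(retries + 1):
        try:
            return generate(prompt)
        except RuntimeError as err:
            last_err = err
            if attempt < retries:
                sleep(retry_delay)
    raise last_err
```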

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
