🤖 feat: add OpenRouter provider support #550
Conversation
Add official OpenRouter provider integration for access to 300+ models through a single API. Fixes errors that occurred when using the baseURL override approach.

Changes:
- Install `@openrouter/ai-sdk-provider` package
- Add OpenRouter to `aiService.ts` `createModel` method
- Add `OpenRouterProviderOptions` to provider types
- Add `OPENROUTER_API_KEY` environment variable support
- Update `docs/models.md` with OpenRouter setup guide

OpenRouter provides:
- Universal model access (Anthropic, OpenAI, Google, Cerebras, etc.)
- Pay-as-you-go pricing with transparent per-token costs
- High availability with automatic failover
- Immediate access to new models

Usage:
```
openrouter:anthropic/claude-3.5-sonnet
openrouter:google/gemini-2.0-flash-thinking-exp
openrouter:cerebras/glm-4.6
openrouter:deepseek/deepseek-chat
```

_Generated with `cmux`_
💡 Codex Review
Lines 1121 to 1125 in 0598fc9
```ts
ipcMain.handle(IPC_CHANNELS.PROVIDERS_LIST, () => {
  try {
    // Return all supported providers, not just configured ones
    // This matches the providers defined in the registry
    return ["anthropic", "openai"];
```
The new OpenRouter integration never shows up in the providers UI because PROVIDERS_LIST still returns only "anthropic" and "openai". The renderer calls this handler to know which providers can be configured (src/browser/api.ts and src/preload.ts), so users will not see or be able to configure OpenRouter credentials even though the backend now supports it. Please include "openrouter" (and any other supported providers) in this list.
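A minimal sketch of the fix the review asks for: derive the list from a single source of truth so the IPC handler and the renderer stay in sync. The constant and function names here are illustrative assumptions, not the project's actual code.

```typescript
// Single source of truth for supported providers (hypothetical names).
// The IPC handler would return this list instead of a hard-coded pair.
const SUPPORTED_PROVIDERS = ["anthropic", "openai", "openrouter"] as const;

type ProviderName = (typeof SUPPORTED_PROVIDERS)[number];

function listProviders(): ProviderName[] {
  // Return all supported providers, not just configured ones
  return [...SUPPORTED_PROVIDERS];
}
```

The handler body from the quoted snippet would then become `return listProviders();`, and adding a future provider requires touching only the constant.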
Enable transparent pass-through of OpenRouter provider routing options
from providers.jsonc to control which infrastructure providers serve
requests (Cerebras, Fireworks, Together, etc.).
Changes:
- Update OpenRouterProviderOptions documentation
- Add OpenRouter case to buildProviderOptions
- Document provider routing in docs/models.md with examples
- Add GLM-4.6 example (z-ai/glm-4.6, not cerebras/glm-4.6)
Usage:
```jsonc
{
"openrouter": {
"apiKey": "sk-or-v1-...",
"provider": {
"order": ["Cerebras", "Fireworks"],
"allow_fallbacks": true
}
}
}
```
The ProviderConfig already supports arbitrary properties via
`[key: string]: unknown`, so OpenRouter options pass through
transparently to the SDK's extraBody parameter.
_Generated with `cmux`_
Map provider routing options from providers.jsonc to OpenRouter's
extraBody parameter. The SDK expects standard options (apiKey, baseURL,
headers, fetch) at the top level and everything else in extraBody.
Before: Spread entire providerConfig (provider routing ignored)
After: Extract standard fields, pass rest via extraBody
This enables provider routing to actually work:
```jsonc
{
"openrouter": {
"apiKey": "sk-or-v1-...",
"provider": {
"require": "Cerebras"
}
}
}
```
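The extract-then-forward step described above can be sketched as follows, under the stated assumption that the SDK takes `apiKey`, `baseURL`, and `headers` at the top level and forwards everything else via `extraBody`. The function and interface names are hypothetical, for illustration only.

```typescript
// Hypothetical shape of a providers.jsonc entry: known standard fields
// plus arbitrary pass-through options (e.g. OpenRouter's `provider` routing).
interface ProviderConfig {
  apiKey?: string;
  baseURL?: string;
  headers?: Record<string, string>;
  [key: string]: unknown;
}

function splitOpenRouterConfig(config: ProviderConfig) {
  // Pull out the fields the SDK expects at the top level...
  const { apiKey, baseURL, headers, ...rest } = config;
  return {
    apiKey,
    baseURL,
    headers,
    // ...and forward everything else (e.g. `provider.order`) as extraBody.
    extraBody: Object.keys(rest).length > 0 ? rest : undefined,
  };
}
```

With the config above, `splitOpenRouterConfig` keeps `apiKey` at the top level and moves the `provider` routing object into `extraBody`, which is what makes routing take effect.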
_Generated with `cmux`_
The documentation incorrectly showed:
"provider": { "require": "Cerebras" }
OpenRouter's API doesn't have a 'require' field. The correct format is:
"provider": { "order": ["Cerebras"], "allow_fallbacks": false }
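Spelled out as a full providers.jsonc entry, in the same shape as the earlier examples in this thread:

```jsonc
{
  "openrouter": {
    "apiKey": "sk-or-v1-...",
    "provider": {
      "order": ["Cerebras"],
      "allow_fallbacks": false
    }
  }
}
```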
Changes:
- Fixed example to use correct 'order' + 'allow_fallbacks' fields
- Added comprehensive list of all provider routing options
- Added link to official OpenRouter provider routing docs
_Generated with `cmux`_
Enable thinking levels for OpenRouter reasoning models (Claude Sonnet Thinking, etc.) by passing `reasoning.effort` through providerOptions.

OpenRouter supports two reasoning control methods:
1. `reasoning.effort`: 'low' | 'medium' | 'high' (maps to our thinking levels)
2. `reasoning.max_tokens`: number (token budget)

We use effort-based control, which maps cleanly to our existing thinking level UI (off/low/medium/high).

Changes:
- Added OPENROUTER_REASONING_EFFORT mapping in thinking.ts
- Updated buildProviderOptions to pass reasoning config when thinking > off
- Added OpenRouterReasoningOptions type for type safety
- Set exclude: false to show reasoning traces in UI

This complements factory-level provider routing (configured in providers.jsonc) with per-request reasoning control (based on the thinking slider).

_Generated with `cmux`_
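The mapping described in this commit can be sketched as below. The constant name `OPENROUTER_REASONING_EFFORT` matches the commit message; the surrounding types and the builder function are assumptions for illustration.

```typescript
// Our thinking-slider levels (from the existing UI).
type ThinkingLevel = "off" | "low" | "medium" | "high";

// Effort-based mapping: each non-off level maps to an OpenRouter effort value.
const OPENROUTER_REASONING_EFFORT: Record<
  Exclude<ThinkingLevel, "off">,
  "low" | "medium" | "high"
> = {
  low: "low",
  medium: "medium",
  high: "high",
};

// Hypothetical helper: build the reasoning config only when thinking > off.
function buildReasoningOptions(level: ThinkingLevel) {
  if (level === "off") return undefined;
  return {
    reasoning: {
      effort: OPENROUTER_REASONING_EFFORT[level],
      exclude: false, // show reasoning traces in the UI
    },
  };
}
```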
Added section explaining how to use the thinking slider with OpenRouter reasoning models. The thinking level controls reasoning.effort (low, medium, high) which works with Claude Sonnet Thinking and other reasoning-capable models via OpenRouter. _Generated with `cmux`_
Ran scripts/update_models.ts to pull latest model data from LiteLLM.

Added Z.AI GLM-4.6 to models-extra.ts with OpenRouter pricing:
- 200K context window (202,752 tokens)
- $0.40/M input, $1.75/M output
- Supports tool use, reasoning, and structured outputs

This fixes model stat lookups for:
- openrouter:z-ai/glm-4.6
- openrouter:anthropic/claude-3.7-sonnet:thinking (already in models.json)

Changes:
- Updated src/utils/tokens/models.json (3,379 additions from LiteLLM)
- Added openrouter/z-ai/glm-4.6 to models-extra.ts

_Generated with `cmux`_
Add official OpenRouter provider integration for access to 300+ models through a single API. Fixes errors that occurred when using the baseURL override approach.

Changes
- Install `@openrouter/ai-sdk-provider` package
- Add OpenRouter to `aiService.ts` `createModel` method
- Add `OpenRouterProviderOptions` to provider types
- Add `OPENROUTER_API_KEY` environment variable support
- Update `docs/models.md` with OpenRouter setup guide

Benefits
OpenRouter provides:
- Universal model access (Anthropic, OpenAI, Google, Cerebras, etc.)
- Pay-as-you-go pricing with transparent per-token costs
- High availability with automatic failover
- Immediate access to new models

Usage
Model format:
```
openrouter:anthropic/claude-3.5-sonnet
openrouter:google/gemini-2.0-flash-thinking-exp
openrouter:cerebras/glm-4.6
openrouter:deepseek/deepseek-chat
```

Testing

Why Not BaseURL Override?
The baseURL override approach has several issues. The official provider handles all edge cases automatically.

_Generated with `cmux`_