Bug: openai-compatible deepseek/glm/minimax models lost reasoning variants after qwen exclusion #23334

@elonazoulay

Description

ProviderTransform.variants() should return the standard low / medium / high reasoning variants for reasoning-capable deepseek, glm, and minimax models when they are routed through @ai-sdk/openai-compatible.

On current dev, those model families are caught by the same blanket guard that excludes qwen, kimi, and related models, so variants() incorrectly returns {} for them.

This issue is intentionally narrow: qwen, kimi, and k2p5 should remain excluded. This is a follow-up to PR #21212, not a blanket revert.
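A minimal sketch of the intended behavior, assuming hypothetical identifiers (the real logic lives in packages/opencode/src/provider/transform.ts, and the actual guard, variant shape, and family-matching names there may differ):

```typescript
// Illustrative sketch only: family names that should stay excluded from
// reasoning variants, per this issue's scope.
const EXCLUDED_FAMILIES = ["qwen", "kimi", "k2p5"];

type Variants = Record<string, { reasoningEffort: string }>;

// Hypothetical stand-in for ProviderTransform.variants() on the
// @ai-sdk/openai-compatible path.
function variants(modelID: string, reasoningCapable: boolean): Variants {
  const id = modelID.toLowerCase();
  // Only the explicitly listed families are excluded; deepseek, glm,
  // and minimax should fall through to the standard variants.
  if (!reasoningCapable || EXCLUDED_FAMILIES.some((f) => id.includes(f))) {
    return {};
  }
  return {
    low: { reasoningEffort: "low" },
    medium: { reasoningEffort: "medium" },
    high: { reasoningEffort: "high" },
  };
}
```

Under this sketch, a reasoning-capable deepseek/glm/minimax model gets the low/medium/high variants, while qwen, kimi, and k2p5 models still return {}.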

OpenCode version

1.4.11

Steps to reproduce

  1. Inspect packages/opencode/src/provider/transform.ts on current dev.
  2. Call ProviderTransform.variants() with a reasoning-capable deepseek, glm, or minimax model using @ai-sdk/openai-compatible.
  3. Observe that it returns {} instead of low / medium / high.

Operating System

macOS

Terminal

zsh

Labels

core (Anything pertaining to core functionality of the application (opencode server stuff))
