feat: add MiniMax as a new LLM provider #11367

octo-patch wants to merge 1 commit into continuedev:main
Conversation
Add MiniMax (https://platform.minimax.io) as a new LLM provider with OpenAI-compatible API support.

Changes:
- Add MiniMax LLM provider class extending OpenAI, with temperature clamping (must be in (0, 1]) and response_format removal
- Register the provider in LLMClasses, openai-adapters, and config-types
- Add model info for MiniMax-M2.5 and MiniMax-M2.5-highspeed (204K context, 192K max output)
- Add GUI model selection entries and provider configuration
- Add a provider documentation page
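The temperature clamping and `response_format` handling described above could be sketched roughly as follows. This is a hypothetical illustration, not the actual `MiniMax.ts` implementation: the function name and the exact clamp floor (0.01) are assumptions.

```typescript
// Hypothetical sketch of the MiniMax request fixes described in the PR;
// the real provider class applies these inside its OpenAI subclass.
type ChatBody = {
  temperature?: number;
  response_format?: unknown;
  [key: string]: unknown;
};

function normalizeMiniMaxBody(body: ChatBody): ChatBody {
  const out: ChatBody = { ...body };
  // MiniMax requires temperature in (0, 1]: it rejects 0, so raise
  // non-positive values to a small floor and cap at 1. The 0.01 floor
  // is an illustrative assumption.
  if (typeof out.temperature === "number") {
    out.temperature = Math.min(Math.max(out.temperature, 0.01), 1);
  }
  // response_format is unsupported by MiniMax, so drop it entirely.
  delete out.response_format;
  return out;
}
```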
I have read the CLA Document and I hereby sign the CLA

PR Bot seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account.
1 issue found across 10 files
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="packages/openai-adapters/src/index.ts">
<violation number="1" location="packages/openai-adapters/src/index.ts:145">
P1: MiniMax is wired to generic `OpenAIApi`, which skips the repo’s MiniMax-specific request fixes (temperature clamping and `response_format` removal), creating a real incompatibility path in adapter-based runtime flows.</violation>
</file>
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
```ts
case "groq":
  return openAICompatible("https://api.groq.com/openai/v1/", config);
case "minimax":
  return openAICompatible("https://api.minimax.io/v1/", config);
```
P1: MiniMax is wired to generic OpenAIApi, which skips the repo’s MiniMax-specific request fixes (temperature clamping and response_format removal), creating a real incompatibility path in adapter-based runtime flows.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At packages/openai-adapters/src/index.ts, line 145:
<comment>MiniMax is wired to generic `OpenAIApi`, which skips the repo’s MiniMax-specific request fixes (temperature clamping and `response_format` removal), creating a real incompatibility path in adapter-based runtime flows.</comment>
<file context>

```diff
@@ -141,6 +141,8 @@ export function constructLlmApi(config: LLMConfig): BaseLlmApi | undefined {
   case "groq":
     return openAICompatible("https://api.groq.com/openai/v1/", config);
+  case "minimax":
+    return openAICompatible("https://api.minimax.io/v1/", config);
   case "sambanova":
     return openAICompatible("https://api.sambanova.ai/v1/", config);
```

</file context>
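One way to address the P1 above would be to wrap the generic OpenAI-compatible client so MiniMax requests receive the provider-specific fixes before dispatch. The sketch below is hypothetical: the `MiniMaxApi` wrapper and the stub `BaseLlmApi`/`openAICompatible` shapes stand in for the repo's real types, and only illustrate the routing idea.

```typescript
// Stub shapes standing in for the repo's LLMConfig/BaseLlmApi types.
interface LLMConfig { provider: string; apiKey?: string; }
interface BaseLlmApi {
  chat(body: Record<string, unknown>): Record<string, unknown>;
}

// Stand-in for the adapter factory; the real one issues HTTP requests.
function openAICompatible(apiBase: string, _config: LLMConfig): BaseLlmApi {
  return { chat: (body) => ({ apiBase, ...body }) };
}

// Hypothetical wrapper applying the MiniMax request fixes (temperature
// clamping, response_format removal) before delegating.
class MiniMaxApi implements BaseLlmApi {
  constructor(private readonly inner: BaseLlmApi) {}

  chat(body: Record<string, unknown>): Record<string, unknown> {
    const fixed = { ...body };
    if (typeof fixed.temperature === "number") {
      // Clamp into (0, 1]; the 0.01 floor is an illustrative assumption.
      fixed.temperature = Math.min(Math.max(fixed.temperature, 0.01), 1);
    }
    delete fixed.response_format; // unsupported by MiniMax
    return this.inner.chat(fixed);
  }
}

function constructLlmApi(config: LLMConfig): BaseLlmApi | undefined {
  switch (config.provider) {
    case "minimax":
      // Route through the wrapper instead of the bare generic client.
      return new MiniMaxApi(
        openAICompatible("https://api.minimax.io/v1/", config),
      );
    default:
      return undefined;
  }
}
```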
Summary
Changes
New Files
- `core/llm/llms/MiniMax.ts` — Provider class extending OpenAI with:
  - Temperature clamping to the `(0, 1]` range (MiniMax rejects `0`)
  - `response_format` removal (unsupported)
  - Base URL `https://api.minimax.io/v1/`
- `packages/llm-info/src/providers/minimax.ts` — Model metadata
- `docs/customize/model-providers/more/minimax.mdx` — Provider documentation

Modified Files
- `core/llm/llms/index.ts` — Register in `LLMClasses`
- `packages/openai-adapters/src/index.ts` — Register OpenAI-compatible adapter
- `packages/config-types/src/index.ts` — Add `"minimax"` to provider enum
- `packages/llm-info/src/index.ts` — Add to `allModelProviders`
- `gui/src/pages/AddNewModel/configs/models.ts` — Model entries
- `gui/src/pages/AddNewModel/configs/providers.ts` — Provider config with API key setup
- `docs/customize/model-providers/overview.mdx` — Add to hosted services table

Models
Test Plan
Summary by cubic
Add MiniMax as an OpenAI-compatible LLM provider with GUI support and docs. Includes `MiniMax-M2.5` and `MiniMax-M2.5-highspeed` models (204K context) and normalizes unsupported params.

New Features
- Register a `minimax` provider using `https://api.minimax.io/v1/` via the OpenAI-compatible adapter.
- Add MiniMax to `LLMClasses`, config types, and `llm-info`.
- Add model entries for `MiniMax-M2.5` and `MiniMax-M2.5-highspeed`.
- Clamp temperature and remove the unsupported `response_format` parameter.

Migration
- Configure `provider: minimax` with your API key, or set `MINIMAX_API_KEY`.
- Set `apiBase` to `https://api.minimaxi.com/v1/`.

Written for commit a15305e. Summary will update on new commits.
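The `MINIMAX_API_KEY` fallback mentioned in the migration notes might be resolved along these lines; the helper name below is hypothetical and only illustrates the configured-key-first precedence.

```typescript
// Hypothetical helper: prefer an explicitly configured key, otherwise
// fall back to the MINIMAX_API_KEY environment variable (as described
// in the migration notes above).
function resolveMiniMaxApiKey(configured?: string): string | undefined {
  return configured ?? process.env.MINIMAX_API_KEY;
}
```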