
Add MiniMax as alternative LLM provider in research agent notebook#595

Open
octo-patch wants to merge 1 commit into superlinked:main from octo-patch:feature/add-minimax-provider


@octo-patch

Summary

The research_ai_agent.ipynb notebook was hardcoded to use OpenAI (GPT-4). This PR adds MiniMax as a drop-in alternative provider, switchable via an environment variable.

MiniMax exposes an OpenAI-compatible REST API, so no extra dependencies are needed — the existing openai SDK works with just a different base_url and API key.

Changes

  • docs/assets/use_cases/research_agent/research_ai_agent.ipynb
    • Replace hardcoded OpenAI-only setup with a two-provider config block (PROVIDER = os.environ.get("LLM_PROVIDER", "openai"))
    • Safe secret retrieval that works in both Google Colab and local environments
    • Default model for MiniMax: MiniMax-M2.7 with a 204K-token context window
    • Update the setup markdown cell with a provider comparison table
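The config block described above might look roughly like the following. This is a sketch, not the notebook's actual code: the dictionary layout is my own, and the MiniMax endpoint URL is an assumption to be checked against MiniMax's documentation.

```python
import os

# Hypothetical two-provider config block; the MiniMax base_url below
# is illustrative -- consult MiniMax's docs for the current endpoint.
PROVIDER = os.environ.get("LLM_PROVIDER", "openai")

PROVIDER_CONFIG = {
    "openai": {
        "model": "gpt-4",
        "base_url": None,  # None -> use the openai SDK's default endpoint
        "key_env": "OPENAI_API_KEY",
    },
    "minimax": {
        "model": "MiniMax-M2.7",
        "base_url": "https://api.minimax.io/v1",  # assumed endpoint
        "key_env": "MINIMAX_API_KEY",
    },
}

if PROVIDER not in PROVIDER_CONFIG:
    raise ValueError(f"Unknown LLM_PROVIDER: {PROVIDER!r}")

cfg = PROVIDER_CONFIG[PROVIDER]
```

Because MiniMax's API is OpenAI-compatible, the existing `openai` client then only needs the selected `base_url` and the key read from the `key_env` variable.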

Usage

OpenAI (default — no change needed):

OPENAI_API_KEY=sk-...  # as before

MiniMax:

LLM_PROVIDER=minimax
MINIMAX_API_KEY=your-key   # from https://www.minimaxi.com/

In Colab, add the relevant key via Tools → Secrets and set LLM_PROVIDER in an early cell or as an environment variable.
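The dual-environment secret lookup could be sketched like this. It is a hypothetical helper, assuming Colab exposes secrets via `google.colab.userdata`; the notebook's actual implementation may differ.

```python
import os
from typing import Optional

def get_secret(name: str) -> Optional[str]:
    """Return a secret from Colab's secret store if available,
    otherwise fall back to an environment variable."""
    try:
        from google.colab import userdata  # importable only inside Colab
        value = userdata.get(name)
        if value:
            return value
    except Exception:
        # Not running in Colab, or the secret is not set there.
        pass
    return os.environ.get(name)
```

With this in place, the setup cell can call something like `get_secret("MINIMAX_API_KEY")` and work unchanged in both environments.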

Why MiniMax?

  • OpenAI-compatible API — zero extra dependencies
  • MiniMax-M2.7 has a 204K-token context window, useful when ingesting large research paper corpora
  • Provides an alternative for users in regions or organizations where OpenAI access is restricted
