
Conversation


@daniel-lxs daniel-lxs commented Feb 9, 2026

Summary

Migrates the OpenAiHandler (the "OpenAI Compatible" provider in src/api/providers/openai.ts) from direct openai npm package usage to the Vercel AI SDK, following the same pattern as already-migrated providers like DeepSeek.

Provider Strategy

| Endpoint Type | AI SDK Package | Method |
| --- | --- | --- |
| Standard OpenAI-compatible | `@ai-sdk/openai` (`createOpenAI`) | `.chat(modelId)` |
| Azure AI Inference (`.services.ai.azure.com`) | `@ai-sdk/openai-compatible` (`createOpenAICompatible`) | Adjusted `baseURL` + `queryParams` for `api-version` |
| Azure OpenAI (`.openai.azure.com`) | `@ai-sdk/azure` (`createAzure`) | `useDeploymentBasedUrls: true` |
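The routing above hinges on recognizing the two Azure hostname suffixes, with everything else falling through to the generic OpenAI-compatible path. A minimal sketch of that detection (the function name `classifyEndpoint` is hypothetical; the handler's actual internals may differ):

```typescript
// Hypothetical sketch of the endpoint routing described in the table.
// The hostname suffixes are the ones listed above; any other host is
// treated as a standard OpenAI-compatible endpoint.
type EndpointKind = "azure-openai" | "azure-ai-inference" | "openai-compatible";

function classifyEndpoint(baseUrl: string): EndpointKind {
	const host = new URL(baseUrl).hostname;
	if (host.endsWith(".openai.azure.com")) return "azure-openai";
	if (host.endsWith(".services.ai.azure.com")) return "azure-ai-inference";
	return "openai-compatible";
}
```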

Key Changes

  • Streaming: Uses streamText() + processAiSdkStreamPart() (replaces client.chat.completions.create with stream)
  • Non-streaming: Uses generateText() (replaces client.chat.completions.create without stream)
  • O3/O1/O4 family: Uses providerOptions.openai.systemMessageMode: 'developer' and reasoningEffort
  • DeepSeek R1 format: System prompt prepended as user message when R1 mode is enabled
  • <think> tag matching: TagMatcher retained for extracting reasoning from text content
  • reasoning_content: Handled natively by AI SDK providers via reasoning-delta events
  • isAiSdkProvider(): Returns true
  • getOpenAiModels(): Standalone function unchanged (still uses axios)

Behavioral Notes

  • Prompt caching via cache_control markers is no longer injected into messages. AI SDK providers handle caching at the provider level where supported.
  • Timeout configuration is now managed by the AI SDK instead of the openai npm package's timeout option.
  • The _isGrokXAI method was removed as stream_options handling is managed internally by the AI SDK.

Test Coverage

All 4 test files updated to mock AI SDK instead of the openai npm package:

  • openai.spec.ts — 44 tests
  • openai-timeout.spec.ts — 5 tests
  • openai-usage-tracking.spec.ts — 3 tests
  • openai-native-tools.spec.ts — 6 tests (1 test updated for new mocking pattern)

58 tests passing, zero regressions across 853 provider tests and 5370 total tests.


Important

Migrates OpenAiHandler to Vercel AI SDK, updating streaming and non-streaming handling, and adjusts tests to mock AI SDK.

  • Behavior:
    • Migrates OpenAiHandler in openai.ts to use Vercel AI SDK instead of openai npm package.
    • Streaming handled by streamText() and processAiSdkStreamPart().
    • Non-streaming handled by generateText().
    • O3 family models use systemMessageMode: 'developer' and reasoningEffort.
    • DeepSeek R1 format prepends system prompt as user message.
    • TagMatcher used for <think> tag matching.
    • reasoning_content handled via reasoning-delta events.
    • isAiSdkProvider() returns true.
    • getOpenAiModels() unchanged.
  • Tests:
    • Updated 4 test files to mock AI SDK: openai.spec.ts, openai-timeout.spec.ts, openai-usage-tracking.spec.ts, openai-native-tools.spec.ts.
    • Total of 58 tests passing with no regressions.
  • Misc:
    • Removed _isGrokXAI method.
    • Timeout and caching managed by AI SDK.
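The R1 handling summarized above amounts to a small message transform. A hedged sketch (the `Msg` shape and function name are illustrative, not the handler's actual types):

```typescript
// Hypothetical sketch: in DeepSeek R1 mode the system prompt is not
// sent with the system role but prepended to the conversation as a
// user message, since R1-style models do not take a system message.
type Msg = { role: "system" | "user" | "assistant"; content: string };

function toR1Messages(systemPrompt: string, messages: Msg[]): Msg[] {
	return [{ role: "user", content: systemPrompt }, ...messages];
}
```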

This description was created by Ellipsis for 8372f41.

Replace direct OpenAI SDK usage in the OpenAI Compatible provider with
Vercel AI SDK, following the same pattern as already-migrated providers.

Provider strategy:
- Standard endpoints: @ai-sdk/openai (createOpenAI) with .chat()
- Azure AI Inference: @ai-sdk/openai-compatible (createOpenAICompatible)
  with adjusted baseURL (/models) and queryParams for api-version
- Azure OpenAI: @ai-sdk/azure (createAzure) with useDeploymentBasedUrls

Key changes:
- Streaming uses streamText() + processAiSdkStreamPart()
- Non-streaming uses generateText()
- O3/O1/O4 family: providerOptions.openai.systemMessageMode='developer'
  and reasoningEffort
- DeepSeek R1 format: system prompt prepended as user message
- TagMatcher retained for <think> tag extraction in streaming text
- reasoning_content handled natively by AI SDK providers
- isAiSdkProvider() returns true
- getOpenAiModels() standalone function unchanged

Tests updated across all 4 test files (58 tests passing).
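The O3/O1/O4 handling in the commit message above can be sketched as building the provider options passed to the AI SDK. The model-name matching shown here is an assumption for illustration; the handler's actual detection may differ:

```typescript
// Hypothetical sketch of the reasoning-model options described above.
// o1/o3/o4-family models take their system prompt via the "developer"
// message mode and accept an optional reasoning-effort hint.
function buildOpenAiProviderOptions(
	modelId: string,
	reasoningEffort?: "low" | "medium" | "high",
): Record<string, unknown> {
	const isReasoningModel = /^o[134](-|$)/.test(modelId);
	if (!isReasoningModel) return {};
	return {
		openai: {
			systemMessageMode: "developer",
			...(reasoningEffort ? { reasoningEffort } : {}),
		},
	};
}
```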
@dosubot dosubot bot added the `size:XXL` (This PR changes 1000+ lines, ignoring generated files) and `Enhancement` (New feature or request) labels Feb 9, 2026

roomote bot commented Feb 9, 2026

Rooviewer

Both previously flagged issues are now resolved. No new issues found.

  • reasoningTokens was silently dropped from processUsageMetrics: the field was typed in the parameter but not included in the output
  • providerMetadata was not used in the streaming or non-streaming paths, losing OpenAI-specific metadata such as detailed token breakdowns

Mention @roomote in a comment to request specific changes to this pull request or fix all unresolved issues.


roomote bot commented Feb 9, 2026

Fixaroo

Fixed the reported issues. processUsageMetrics now includes reasoningTokens in the output and accepts providerMetadata to extract OpenAI-specific cache/reasoning token breakdowns in both streaming and non-streaming paths. All 60 tests pass.
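The fix described above can be sketched as follows. The exact field names on the usage object and the provider metadata are assumptions for illustration; the real handler's shapes may differ:

```typescript
// Hypothetical sketch of the fix: usage metrics now carry
// reasoningTokens, and provider metadata (when present) supplies
// OpenAI-specific cache-read token counts.
interface UsageMetrics {
	inputTokens: number;
	outputTokens: number;
	reasoningTokens?: number;
	cacheReadTokens?: number;
}

function processUsageMetrics(
	usage: { inputTokens?: number; outputTokens?: number; reasoningTokens?: number },
	providerMetadata?: { openai?: { cachedPromptTokens?: number } },
): UsageMetrics {
	return {
		inputTokens: usage.inputTokens ?? 0,
		outputTokens: usage.outputTokens ?? 0,
		reasoningTokens: usage.reasoningTokens,
		cacheReadTokens: providerMetadata?.openai?.cachedPromptTokens,
	};
}
```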


@dosubot dosubot bot added the `lgtm` (This PR has been approved by a maintainer) label Feb 9, 2026
@daniel-lxs daniel-lxs merged commit a4914c4 into main Feb 9, 2026
13 checks passed
@daniel-lxs daniel-lxs deleted the migrate-openai-to-ai-sdk branch February 9, 2026 23:49
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Feb 9, 2026