feat: migrate OpenAiHandler to AI SDK #11351
Conversation
Replace direct OpenAI SDK usage in the OpenAI Compatible provider with the Vercel AI SDK, following the same pattern as already-migrated providers.

Provider strategy:
- Standard endpoints: `@ai-sdk/openai` (`createOpenAI`) with `.chat()`
- Azure AI Inference: `@ai-sdk/openai-compatible` (`createOpenAICompatible`) with adjusted baseURL (`/models`) and `queryParams` for api-version
- Azure OpenAI: `@ai-sdk/azure` (`createAzure`) with `useDeploymentBasedUrls`

Key changes:
- Streaming uses `streamText()` + `processAiSdkStreamPart()`
- Non-streaming uses `generateText()`
- O3/O1/O4 family: `providerOptions.openai.systemMessageMode = 'developer'` and `reasoningEffort`
- DeepSeek R1 format: system prompt prepended as a user message
- `TagMatcher` retained for `<think>` tag extraction in streaming text
- `reasoning_content` handled natively by AI SDK providers
- `isAiSdkProvider()` returns `true`
- `getOpenAiModels()` standalone function unchanged

Tests updated across all 4 test files (58 tests passing).
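The three-way endpoint split above might be implemented along these lines: a minimal sketch assuming the `*.services.ai.azure.com` and `*.openai.azure.com` hostname suffixes mentioned in this PR. The function name and return labels are hypothetical, not the actual code in `openai.ts`.

```typescript
// Hypothetical sketch of the three-way endpoint classification; the real
// openai.ts logic may inspect the configured base URL differently.
type ProviderStrategy = "openai" | "openai-compatible" | "azure";

function classifyEndpoint(baseUrl: string): ProviderStrategy {
  const host = new URL(baseUrl).hostname;
  // Azure AI Inference endpoints go through @ai-sdk/openai-compatible,
  // with an adjusted /models baseURL and an api-version query parameter.
  if (host.endsWith(".services.ai.azure.com")) return "openai-compatible";
  // Azure OpenAI endpoints use @ai-sdk/azure with deployment-based URLs.
  if (host.endsWith(".openai.azure.com")) return "azure";
  // Everything else is treated as a standard OpenAI-style endpoint.
  return "openai";
}

console.log(classifyEndpoint("https://myorg.services.ai.azure.com")); // "openai-compatible"
console.log(classifyEndpoint("https://api.openai.com/v1")); // "openai"
```

Keeping the classification in one pure function like this makes each branch easy to unit-test without constructing a real AI SDK client.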
Both previously flagged issues are now resolved. No new issues found.
Mention @roomote in a comment to request specific changes to this pull request or fix all unresolved issues.
Fixed the reported issues.
Summary
Migrates the `OpenAiHandler` (the "OpenAI Compatible" provider in `src/api/providers/openai.ts`) from direct `openai` npm package usage to the Vercel AI SDK, following the same pattern as already-migrated providers like DeepSeek.

Provider Strategy
- Standard endpoints: `@ai-sdk/openai` (`createOpenAI`) with `.chat(modelId)`
- Azure AI Inference (`*.services.ai.azure.com`): `@ai-sdk/openai-compatible` (`createOpenAICompatible`) with `queryParams` for api-version
- Azure OpenAI (`*.openai.azure.com`): `@ai-sdk/azure` (`createAzure`) with `useDeploymentBasedUrls: true`

Key Changes

- Streaming: `streamText()` + `processAiSdkStreamPart()` (replaces `client.chat.completions.create` with stream)
- Non-streaming: `generateText()` (replaces `client.chat.completions.create` without stream)
- O3/O1/O4 family: `providerOptions.openai.systemMessageMode: 'developer'` and `reasoningEffort`
- `<think>` tag matching: `TagMatcher` retained for extracting reasoning from text content
- `reasoning_content`: handled natively via `reasoning-delta` events
- `isAiSdkProvider()`: returns `true`
- `getOpenAiModels()`: standalone function unchanged (still uses axios)
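The `TagMatcher` item above can be illustrated with a simplified, non-streaming sketch. This hypothetical `splitThinkTags` helper operates on a complete string, whereas the real `TagMatcher` works incrementally across stream chunks:

```typescript
// Simplified, non-streaming sketch of <think> tag extraction. The actual
// TagMatcher in the PR processes text incrementally as it streams, so it
// must also handle tags split across chunk boundaries.
function splitThinkTags(text: string): { reasoning: string; content: string } {
  const match = text.match(/<think>([\s\S]*?)<\/think>/);
  if (!match) return { reasoning: "", content: text };
  return {
    reasoning: match[1],
    content: text.replace(match[0], "").trim(),
  };
}

console.log(splitThinkTags("<think>check the units</think>The answer is 42."));
// { reasoning: 'check the units', content: 'The answer is 42.' }
```

This separation is what lets models that emit reasoning inline as `<think>…</think>` text (rather than via a dedicated reasoning field) still surface their reasoning separately from the answer.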
Behavioral Notes

- Prompt caching via `cache_control` markers is no longer injected into messages; AI SDK providers handle caching at the provider level where supported.
- Request timeouts no longer go through the `openai` npm package's `timeout` option.
- The `_isGrokXAI` method was removed, as `stream_options` handling is managed internally by the AI SDK.
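The `reasoning-delta` handling noted under Key Changes can be pictured as a mapping from AI SDK stream parts to the output chunks the handler yields. This is a simplified, synchronous sketch; the part shapes and field names are assumptions for illustration, not the actual `@ai-sdk` types consumed by `processAiSdkStreamPart`:

```typescript
// Illustrative stream-part shapes; the real AI SDK part types differ and
// include many more variants (tool calls, usage, finish events, etc.).
type StreamPart =
  | { type: "reasoning-delta"; text: string }
  | { type: "text-delta"; text: string };

type ApiChunk = { type: "reasoning" | "text"; text: string };

// Map each part to the chunk kind the handler would yield: reasoning
// deltas become "reasoning" chunks (the reasoning_content equivalent),
// text deltas become plain "text" chunks.
function toApiChunks(parts: StreamPart[]): ApiChunk[] {
  return parts.map((part) =>
    part.type === "reasoning-delta"
      ? { type: "reasoning" as const, text: part.text }
      : { type: "text" as const, text: part.text }
  );
}

console.log(
  toApiChunks([
    { type: "reasoning-delta", text: "planning" },
    { type: "text-delta", text: "Hello" },
  ])
);
```

Because the AI SDK already surfaces `reasoning_content` as `reasoning-delta` parts, the handler no longer needs provider-specific parsing for it; only inline `<think>` text still requires the `TagMatcher`.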
Test Coverage

All 4 test files updated to mock the AI SDK instead of the `openai` npm package:

- `openai.spec.ts` — 44 tests
- `openai-timeout.spec.ts` — 5 tests
- `openai-usage-tracking.spec.ts` — 3 tests
- `openai-native-tools.spec.ts` — 6 tests (1 test updated for the new mocking pattern)

58 tests passing, zero regressions across 853 provider tests and 5370 total tests.
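The O-series handling from Key Changes (developer-mode system messages plus a reasoning-effort setting) could be sketched as a small options builder. The model-ID regex and the default effort value here are assumptions for illustration, not the actual logic in `openai.ts`:

```typescript
// Hypothetical sketch: detect O-series reasoning models and build the
// providerOptions described in the summary. The regex and the default
// effort are illustrative assumptions.
function isOSeriesModel(modelId: string): boolean {
  return /^o[134](-|$)/.test(modelId);
}

function buildProviderOptions(modelId: string, reasoningEffort = "medium") {
  if (!isOSeriesModel(modelId)) return {};
  return {
    openai: {
      // Send the system prompt as a "developer" message, which O-series
      // models expect, and pass the reasoning effort through.
      systemMessageMode: "developer",
      reasoningEffort,
    },
  };
}

console.log(buildProviderOptions("o3-mini"));
console.log(buildProviderOptions("gpt-4.1")); // {}
```

Centralizing the decision in one builder keeps the O-series special-casing out of the streaming and non-streaming call sites.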
Important
Migrates `OpenAiHandler` to the Vercel AI SDK, updating streaming and non-streaming handling, and adjusts tests to mock the AI SDK.

- Migrates `OpenAiHandler` in `openai.ts` to use the Vercel AI SDK instead of the `openai` npm package.
- Streaming uses `streamText()` and `processAiSdkStreamPart()`.
- Non-streaming uses `generateText()`.
- O3/O1/O4 family: `systemMessageMode: 'developer'` and `reasoningEffort`.
- `TagMatcher` used for `<think>` tag matching.
- `reasoning_content` handled via `reasoning-delta` events.
- `isAiSdkProvider()` returns `true`.
- `getOpenAiModels()` unchanged.
- Tests updated: `openai.spec.ts`, `openai-timeout.spec.ts`, `openai-usage-tracking.spec.ts`, `openai-native-tools.spec.ts`.
- Removed the `_isGrokXAI` method.

This description was created by
for 8372f41.