Issue 1467 Fix #1523
base: dev
Conversation
PR Reviewer Guide 🔍
Here are some key observations to aid the review process:

PR Code Suggestions ✨
Explore these optional code suggestions:
Caution
Changes requested ❌
Reviewed everything up to 8bd93ae in 58 seconds.
- Reviewed 24 lines of code in 1 file
- Skipped 0 files when reviewing
- Skipped posting 1 draft comment; it is shown below
- Modify your settings and rules to customize what types of comments Ellipsis leaves, and don't forget to react with 👍 or 👎 to teach Ellipsis
1. src/agents-api/agents_api/clients/litellm.py:140
- Draft comment: Normalization is applied unconditionally in aembedding now. Consider aligning the handling logic with acompletion (perhaps via a shared utility) to prevent potential inconsistencies in future changes.
- Reason this comment was not posted: Marked as duplicate.
Workflow ID: wflow_KPakXrkFTTnu106G
You can customize by changing your verbosity settings, reacting with 👍 or 👎, replying to comments, or adding code review rules.
```diff
 custom_api_key = secret and secret.value
-model = f"openai/{model}"  # This is needed for litellm
+model = f"openai/{model.removeprefix('openai/')}"  # This is needed for litellm
```
Consider extracting the model normalization (using removeprefix) into a helper function to avoid duplication and ensure consistent behavior across functions. Also, ensure that your minimum Python version supports str.removeprefix (Python 3.9+).
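One possible shape for such a helper, sketched here for discussion only; the name _normalize_model_name and its signature are assumptions and are not part of this PR:

```python
def _normalize_model_name(model: str, prefix: str = "openai/") -> str:
    """Return the model name carrying exactly one copy of the given prefix.

    Note: str.removeprefix requires Python 3.9+.
    """
    return f"{prefix}{model.removeprefix(prefix)}"
```

Both acompletion and aembedding could then call the same helper, which would also address the shared-utility suggestion in the draft comment above.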
User description
🔧 fix: normalize openai/ model prefix in aembedding and acompletion

Summary:
Fixes inconsistent and conditional handling of the openai/ prefix in model names. Previously, aembedding only applied the prefix when custom_api_key was not set, leading to malformed model names like openai/openai/... and broken usage tracking.

Changes:
- Normalize the model prefix consistently in aembedding and acompletion.

PR Type
Bug fix
Description
- Fix inconsistent openai/ prefix handling in model names
- Normalize model prefix in both acompletion and aembedding functions
- Prevent malformed model names like openai/openai/...
- Remove conditional prefix application in aembedding

Changes diagram
```mermaid
flowchart LR
    A["Model Name Input"] --> B["removeprefix('openai/')"]
    B --> C["Add 'openai/' prefix"]
    C --> D["Normalized Model Name"]
```

Changes walkthrough 📝
litellm.py: Normalize openai model prefix handling
src/agents-api/agents_api/clients/litellm.py
- Normalize the model prefix in the acompletion function using removeprefix
- Apply the same normalization in the aembedding function
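To make the behavior described in the Summary concrete, here is an illustrative before/after sketch of the aembedding prefix handling. It is a simplification, not the real function bodies, and the embedding model names are assumptions.

```python
from typing import Optional

# Before: the prefix was only added when no custom_api_key was set, so a model
# name that already carried "openai/" came out doubled ("openai/openai/..."),
# and the handling diverged from acompletion. Simplified sketch of the old logic.
def old_aembedding_prefix(model: str, custom_api_key: Optional[str]) -> str:
    if not custom_api_key:
        model = f"openai/{model}"
    return model

# After: the prefix is normalized unconditionally, matching acompletion.
def new_aembedding_prefix(model: str, custom_api_key: Optional[str]) -> str:
    return f"openai/{model.removeprefix('openai/')}"

print(old_aembedding_prefix("openai/text-embedding-3-small", None))  # openai/openai/text-embedding-3-small
print(new_aembedding_prefix("openai/text-embedding-3-small", None))  # openai/text-embedding-3-small
print(new_aembedding_prefix("text-embedding-3-small", "sk-..."))     # openai/text-embedding-3-small
```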