fix: remove inappropriate parameters for o3 from the request #6131
Conversation
@MonadMonAmi is attempting to deploy a commit to the NextChat Team on Vercel. A member of the Team first needs to authorize it.
Walkthrough

The changes expand the model type checks in ChatGPTApi to detect "o3" models alongside "o1" and adjust the request payload accordingly.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant C as Client
    participant API as ChatGPTApi
    participant SRV as OpenAI Server
    C->>API: chat(options)
    API->>API: Check if model starts with "o1" or "o3" (isO1/isO3)
    API->>API: Filter out "system" messages if isO1 or isO3
    API->>API: Set requestPayload parameters (temperature, penalties, top_p, max_completion_tokens)
    API->>SRV: Send requestPayload
    SRV-->>API: Return response
    API-->>C: Deliver chat response
```
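The flow above can be sketched as follows. This is an illustrative sketch only, assuming the checks the diagram describes; the function and field names are hypothetical and not the exact NextChat implementation.

```typescript
// Hypothetical sketch of the payload construction the diagram describes.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function buildRequestPayload(model: string, messages: ChatMessage[]) {
  const isO1 = model.startsWith("o1");
  const isO3 = model.startsWith("o3");

  // Reasoning models reject "system" messages, so they are dropped.
  const filtered =
    isO1 || isO3 ? messages.filter((m) => m.role !== "system") : messages;

  return {
    model,
    messages: filtered,
    // Reasoning models do not accept sampling parameters and use
    // max_completion_tokens instead of max_tokens (limit of 4000 is
    // an arbitrary example value).
    ...(isO1 || isO3
      ? { max_completion_tokens: 4000 }
      : {
          temperature: 0.7,
          top_p: 1,
          presence_penalty: 0,
          frequency_penalty: 0,
          max_tokens: 4000,
        }),
  };
}
```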
Actionable comments posted: 0
🧹 Nitpick comments (1)
app/client/platforms/openai.ts (1)

222-223: Update comment to include O3 limitations.

The code comment at line 226 only mentions O1 limitations, but the code implies O3 has similar limitations. Consider updating the comment to include O3.

```diff
-// O1 not support image, tools (plugin in ChatGPTNextWeb) and system, stream, logprobs, temperature, top_p, n, presence_penalty, frequency_penalty yet.
+// O1 and O3 do not support image, tools (plugin in ChatGPTNextWeb) and system, stream, logprobs, temperature, top_p, n, presence_penalty, frequency_penalty yet.
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- app/client/platforms/openai.ts (2 hunks)
🔇 Additional comments (3)
app/client/platforms/openai.ts (3)
198-200: LGTM! Clear and consistent model type checks.

The new model type checks follow existing naming conventions and are implemented logically.
229-234: Verify streaming support for O3.

The code allows streaming for O3 but disables it for O1. Please verify that this is the intended behavior, as both models seem to share the other limitations.

```shell
#!/bin/bash
# Description: Search for any documentation or tests that confirm O3's streaming capability
rg -A 5 "o3.*stream|stream.*o3"
```
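The behavior this comment questions can be sketched as a small helper. This is a hypothetical illustration of what the review describes, not the actual NextChat code: streaming is forced off for o1 models but left at the requested value for o3.

```typescript
// Hypothetical helper mirroring the reviewed behavior: o1 never streams,
// o3 (and every other model) keeps the caller's requested setting.
function resolveStreamFlag(model: string, requestedStream: boolean): boolean {
  const isO1 = model.startsWith("o1");
  return requestedStream && !isO1;
}
```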
239-242: Update comment and verify token control for O3.

The comment only mentions O1's token control mechanism, but the code applies it to O3 as well. Consider:

- Updating the comment to include O3
- Verifying whether O3 uses the same token control mechanism

```diff
-// O1 使用 max_completion_tokens 控制token数 (https://platform.openai.com/docs/guides/reasoning#controlling-costs)
+// O1 and O3 use max_completion_tokens to control token count (https://platform.openai.com/docs/guides/reasoning#controlling-costs)
```

```shell
#!/bin/bash
# Description: Search for documentation about O3's token control mechanism
rg -A 5 "o3.*token|token.*o3|max_completion_tokens.*o3"
```
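The token-control distinction discussed above can be illustrated with a minimal sketch. The function name is an assumption for illustration; the parameter names `max_completion_tokens` and `max_tokens` are the ones named in the review.

```typescript
// Illustrative only: reasoning models (o1/o3) take max_completion_tokens,
// while other chat models take max_tokens for the same limit.
function tokenLimitParams(model: string, limit: number): Record<string, number> {
  const isReasoning = model.startsWith("o1") || model.startsWith("o3");
  return isReasoning ? { max_completion_tokens: limit } : { max_tokens: limit };
}
```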
fix #6132
💻 Change Type
🔀 Description of Change
📝 Additional Information
Summary by CodeRabbit