Streaming Fails Due to `{"include_usage": True}` #442
Comments
Proposed fix makes sense, PR welcome!

@rm-openai I created a PR, please review.
rm-openai pushed a commit that referenced this issue on Apr 10, 2025:

fix issue #442. Below is an example that overrides include_usage:

```
result = Runner.run_streamed(
    agent,
    "Write a haiku about recursion in programming.",
    run_config=RunConfig(
        model_provider=CUSTOM_MODEL_PROVIDER,
        model_settings=ModelSettings(include_usage=True),
    ),
)
```
Lightblues pushed a commit to Lightblues/openai-agents-python that referenced this issue on Apr 13, 2025 (same commit message and example as above, against openai#442).
Description

When using Mistral AI's OpenAI-compatible API with streaming output via `Runner.run_streamed`, the hardcoded `stream_options={"include_usage": True}` triggers a `422 Unprocessable Entity` error. Mistral's API does not support the `include_usage` parameter in `stream_options`, causing compatibility issues.

Root Cause:
openai-agents-python/src/agents/models/openai_chatcompletions.py
Line 539 in 064e25b
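For reference, a minimal repro sketch (not taken from the issue itself): it assumes a key in a `MISTRAL_API_KEY` environment variable, Mistral's OpenAI-compatible endpoint at `https://api.mistral.ai/v1`, the `mistral-small-latest` model, and the top-level `agents` imports used in the SDK's custom model provider examples; import paths may differ between versions. On affected versions, streaming a run like this is what surfaces the 422.

```python
# Hypothetical repro sketch; endpoint, model name, and env var are assumptions.
import asyncio
import os

from openai import AsyncOpenAI
from agents import (
    Agent,
    Model,
    ModelProvider,
    OpenAIChatCompletionsModel,
    RunConfig,
    Runner,
    set_tracing_disabled,
)

# Mistral's OpenAI-compatible endpoint.
client = AsyncOpenAI(
    base_url="https://api.mistral.ai/v1",
    api_key=os.environ["MISTRAL_API_KEY"],
)
set_tracing_disabled(True)  # no OpenAI key is configured in this sketch


class MistralProvider(ModelProvider):
    def get_model(self, model_name: str | None) -> Model:
        return OpenAIChatCompletionsModel(
            model=model_name or "mistral-small-latest",
            openai_client=client,
        )


async def main() -> None:
    agent = Agent(name="Assistant", instructions="Reply concisely.")
    result = Runner.run_streamed(
        agent,
        "Write a haiku about recursion in programming.",
        run_config=RunConfig(model_provider=MistralProvider()),
    )
    # On affected versions the hardcoded stream_options={"include_usage": True}
    # makes Mistral reject the request with 422 Unprocessable Entity.
    async for _event in result.stream_events():
        pass


asyncio.run(main())
```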
Proposed Fix

Add an override mechanism for `stream_options`, as sketched below.
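One way the override could look, assuming `ModelSettings` gains an `include_usage` field as in the commit referenced above (this is a sketch, not the exact upstream code): the chat-completions call only passes `stream_options` when the caller explicitly sets it, so providers like Mistral that reject `include_usage` never receive the parameter.

```python
# Sketch of the override logic, not the exact upstream implementation.
from typing import Any

from openai import NOT_GIVEN
from agents import ModelSettings


def resolve_stream_options(stream: bool, model_settings: ModelSettings) -> Any:
    """Only forward stream_options when the caller opted in via ModelSettings."""
    # getattr() because include_usage only exists on versions that ship this fix.
    include_usage = getattr(model_settings, "include_usage", None)
    if stream and include_usage is not None:
        return {"include_usage": include_usage}
    # NOT_GIVEN makes the OpenAI client omit the field from the request body.
    return NOT_GIVEN


# Default: nothing is sent, so providers that reject include_usage are unaffected.
print(resolve_stream_options(stream=True, model_settings=ModelSettings()))
# Opt-in (requires a release containing the fix for #442):
# print(resolve_stream_options(True, ModelSettings(include_usage=True)))
```

The hardcoded call in `openai_chatcompletions.py` would then pass the resolved value instead of the literal `{"include_usage": True}`.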
Error: `422 Unprocessable Entity` returned by Mistral's API when streaming.