Conversation

@s44002 commented Mar 12, 2025

Fixes #71

This PR adds support for max_tokens in ModelSettings. This was previously missing, causing issues with models that require an explicit max_tokens value.
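A minimal sketch of the change, assuming a `ModelSettings` dataclass like the one in the SDK (field names and the `to_request_kwargs` helper here are illustrative, not the actual source): an optional `max_tokens` field that is only forwarded to the API call when explicitly set.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ModelSettings:
    """Illustrative sketch of a settings object with the new max_tokens field."""

    temperature: Optional[float] = None
    max_tokens: Optional[int] = None  # the field this PR adds

    def to_request_kwargs(self) -> dict:
        """Build the kwargs passed to the model API, skipping unset fields."""
        kwargs = {}
        if self.temperature is not None:
            kwargs["temperature"] = self.temperature
        if self.max_tokens is not None:
            kwargs["max_tokens"] = self.max_tokens
        return kwargs


settings = ModelSettings(max_tokens=1024)
print(settings.to_request_kwargs())  # {'max_tokens': 1024}
```

Keeping the field `Optional` and omitting it when unset preserves the old behavior for providers that don't need it, while models that require an explicit `max_tokens` can now receive one.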

@rm-openai
Collaborator

Apologies, I didn't see this issue/PR in time and implemented it myself via #105

@rm-openai closed this Mar 12, 2025
Development

Successfully merging this pull request may close these issues:

max_tokens is not an accepted parameter