`meta/meta-llama-3-70b` seems to ignore `max_tokens`
I'm pretty sure I'm sending `max_tokens`, and I do see `max_tokens` when looking at my prediction in the browser. When I use exactly the same code for e.g. `meta/llama-2-70b` this does not happen, i.e. there I really get the requested number of tokens.
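
For reference, a minimal sketch of the kind of call I mean, assuming the official `replicate` Python client with `REPLICATE_API_TOKEN` set in the environment. The prompt and token limit are illustrative placeholders, not the original code from this report:

```python
import replicate

PROMPT = "Write a long, detailed story about a lighthouse keeper."
MAX_TOKENS = 512  # requested maximum output length

for model in ("meta/meta-llama-3-70b", "meta/llama-2-70b"):
    # replicate.run() yields the model output as string chunks
    output = replicate.run(
        model,
        input={"prompt": PROMPT, "max_tokens": MAX_TOKENS},
    )
    text = "".join(output)
    # Word count is only a rough proxy for tokens; the exact input that
    # was sent can also be inspected on the prediction page in the browser.
    print(f"{model}: ~{len(text.split())} words returned")
```

With identical inputs, only the `meta/llama-2-70b` output reflects the requested limit.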