Possibly with the server example: https://github.com/ggerganov/llama.cpp/tree/master/examples/server
You would need a script to manage the queries and collect the results.
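For example, a minimal Python sketch along these lines could work, assuming the server example is already running locally on its default port (8080) and accepting requests on the `/completion` endpoint; the prompts and the `n_predict` value here are just placeholders:

```python
# Minimal sketch: send several prompts to a running llama.cpp server
# and collect the outputs. Assumes the server was started first, e.g.:
#   ./server -m models/your-model.gguf --port 8080
import json
import urllib.request

SERVER_URL = "http://localhost:8080/completion"  # default server port

prompts = [
    "Explain what a transformer is in one sentence.",
    "Write a haiku about GPUs.",
    "List three uses of quantization.",
]

def complete(prompt: str, n_predict: int = 128) -> str:
    """Send one prompt to the server and return the generated text."""
    payload = json.dumps({"prompt": prompt, "n_predict": n_predict}).encode("utf-8")
    req = urllib.request.Request(
        SERVER_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The server replies with JSON; the generated text is in "content".
        return json.loads(resp.read())["content"]

# Query the server once per prompt and collect the results.
results = {p: complete(p) for p in prompts}
for prompt, output in results.items():
    print(f"### {prompt}\n{output}\n")
```

This sends the prompts one after another; to overlap requests you could issue them from multiple threads instead, keeping in mind that how the server handles concurrent requests depends on how it was built and launched.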
I have the same question.
I have multiple prompts, and I want to feed them all to the model at once to generate the outputs. Can you tell me how to achieve this?