

Add temperature support to vLLM #645

Merged 1 commit into vanna-ai:main on Sep 17, 2024

Conversation

AmitSinghShorthillsAI
Contributor

Added temperature support for hosted LLM using vLLM

Changes made:

  • Introduced a default temperature of 0.7 in the __init__ method
  • Updated the JSON payload in the submit_prompt method to include the temperature

This change allows users to control the randomness of the model's output. If not specified, it defaults to 0.7, providing a balance between creativity and coherence in the generated text.
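The change described above can be sketched as follows. This is a minimal illustration, not the exact vanna-ai source: the class name (`VllmHosted`), the endpoint path, and the `_build_payload` helper are assumptions; only the `__init__` default of 0.7 and the temperature field in the `submit_prompt` payload come from the PR description.

```python
import json
import urllib.request


class VllmHosted:
    """Hypothetical wrapper for a hosted LLM served by vLLM."""

    def __init__(self, host: str, model: str, temperature: float = 0.7):
        self.host = host
        self.model = model
        # New: configurable temperature, defaulting to 0.7 as a balance
        # between creativity and coherence.
        self.temperature = temperature

    def _build_payload(self, prompt: str) -> dict:
        # The JSON payload now carries the temperature field.
        return {
            "model": self.model,
            "prompt": prompt,
            "temperature": self.temperature,
        }

    def submit_prompt(self, prompt: str) -> str:
        # POST the payload to the (assumed) vLLM completions endpoint.
        req = urllib.request.Request(
            f"{self.host}/v1/completions",
            data=json.dumps(self._build_payload(prompt)).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["choices"][0]["text"]
```

A caller who wants more deterministic output can pass a lower value, e.g. `VllmHosted(host, model, temperature=0.2)`; omitting the argument keeps the 0.7 default.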

@AmitSinghShorthillsAI
Contributor Author

@zainhoda Please review

@zainhoda zainhoda merged commit 8cc87a0 into vanna-ai:main Sep 17, 2024
2 participants