I think we should provide longer-form explanations of how llm-load-test is expected to behave with each back-end and endpoint configuration, along with much more detailed descriptions of the parameters a user can modify/tune in the config file. Embedding complete explanations of every parameter directly in the config file, however, would ruin its readability.
Example:
`format_prompt` is a parameter available under `dataset` in the config that a user will never discover unless they dig through the code, and it has never appeared in the config before. If I were to add inline information like this for every such parameter, the config would quickly become unreadable.
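To make the point concrete, separate documentation could show where such a parameter lives without cluttering the config itself. The sketch below is a hypothetical excerpt: the surrounding keys, values, and comments are illustrative assumptions, not llm-load-test's actual schema — only the `dataset` section and the `format_prompt` key come from this issue.

```yaml
# Hypothetical llm-load-test config excerpt (keys and values other than
# dataset/format_prompt are assumptions for illustration).
dataset:
  file: "datasets/example.jsonl"   # assumed: path to the prompt dataset
  format_prompt: true              # undocumented flag discussed in this issue;
                                   # its behavior is defined in the code, so a
                                   # docs page, not the config, should explain it
```

Keeping a one-line comment in the config and the full explanation in a dedicated docs page would preserve both discoverability and readability.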