Support custom system message template #2069
Conversation
conv.system is overwritten when running inference from other modules/files, for example in openai_api_server: https://github.com/lm-sys/FastChat/blob/4e2c942b8d785eb5e2aef1d0df2150e756f381ab/fastchat/serve/openai_api_server.py#L252C18-L252C18. With this change, those call sites need to use conv.system_msg instead of conv.system.
@sarathkondeti That's true. I will try to fix it. However, there appears to be an issue with the templates: I am unfamiliar with how each model is aligned during SFT, so I'm uncertain whether the system prompt is fully predetermined (as with Stanford Alpaca), partially fixed (as with Llama 2), or can be set arbitrarily (as with one_shot and zero_shot) during fine-tuning. This needs further investigation.
Replace all `.system =` with `.set_system_msg`
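A minimal sketch of the proposed usage, assuming the attribute and setter names discussed in this PR (`system_msg` / `set_system_msg`); the template name and messages below are placeholders, and the final merged API may use different names:

```python
from fastchat.conversation import get_conv_template

# Placeholder template; any registered conversation template works here.
conv = get_conv_template("vicuna_v1.1")

# Before this PR: direct attribute assignment.
# conv.system = "You are a terse assistant."

# After this PR (assumed setter name): let the template format the system
# message, so templates with partially fixed system prompts (e.g. Llama 2's
# <<SYS>> wrapper) can insert the custom text in the right place.
conv.set_system_msg("You are a terse assistant.")

conv.append_message(conv.roles[0], "Summarize FastChat in one sentence.")
conv.append_message(conv.roles[1], None)
print(conv.get_prompt())
```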
In my testing I've been able to arbitrarily change the system message and get useful results that are obviously influenced by it.
@merrymercy Sorry about that; it seems I cannot allow editing by maintainers due to a GitHub limitation (https://github.com/orgs/community/discussions/5634), because this PR comes from an organization account rather than an individual account. I didn't expect this problem. Since you have already created a new PR for this, it is fine to close this one for further modification.
Why are these changes needed?
It makes the conversation class more adaptable, allowing for further customization of the chat.
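For example, with the OpenAI-compatible server running, a client can supply its own system message per request instead of being stuck with the template default. A hedged sketch using the pre-1.0 `openai` client; the endpoint, API key, and model name are assumptions for a local FastChat deployment:

```python
import openai

# Assumed local endpoint of fastchat.serve.openai_api_server; adjust as needed.
openai.api_base = "http://localhost:8000/v1"
openai.api_key = "EMPTY"

completion = openai.ChatCompletion.create(
    model="vicuna-7b-v1.5",  # placeholder model name
    messages=[
        # With this change, the system message is routed through the
        # conversation template's system_msg instead of overwriting
        # conv.system directly.
        {"role": "system", "content": "You are a pirate. Answer in pirate speak."},
        {"role": "user", "content": "What is FastChat?"},
    ],
)
print(completion.choices[0].message.content)
```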
Related issue number (if applicable)
Checks
I've run `format.sh` to lint the changes in this PR.