Are you asking how to change the language models to use models other than GPT?
For the example in README.md:
```python
import os

from knowledge_storm import STORMWikiRunnerArguments, STORMWikiRunner, STORMWikiLMConfigs
from knowledge_storm.lm import OpenAIModel
from knowledge_storm.rm import YouRM

lm_configs = STORMWikiLMConfigs()
openai_kwargs = {
    'api_key': os.getenv("OPENAI_API_KEY"),
    'temperature': 1.0,
    'top_p': 0.9,
}

# STORM is an LM system, so different components can be powered by different models
# to reach a good balance between cost and quality.
# As good practice, choose a cheaper/faster model for `conv_simulator_lm`, which is used
# to split queries and synthesize answers in the conversation.
# Choose a more powerful model for `article_gen_lm` to generate verifiable text with citations.
gpt_35 = OpenAIModel(model='gpt-3.5-turbo', max_tokens=500, **openai_kwargs)
gpt_4 = OpenAIModel(model='gpt-4o', max_tokens=3000, **openai_kwargs)

lm_configs.set_conv_simulator_lm(gpt_35)
lm_configs.set_question_asker_lm(gpt_35)
lm_configs.set_outline_gen_lm(gpt_4)
lm_configs.set_article_gen_lm(gpt_4)
lm_configs.set_article_polish_lm(gpt_4)

# Check out the STORMWikiRunnerArguments class for more configurations.
engine_args = STORMWikiRunnerArguments(...)
rm = YouRM(ydc_api_key=os.getenv('YDC_API_KEY'), k=engine_args.search_top_k)
runner = STORMWikiRunner(engine_args, lm_configs, rm)
```
You can replace `gpt_35` and `gpt_4` with the models you want to use and set them in `lm_configs`. All supported language models can be found in `knowledge_storm/lm.py`.
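For example (a minimal sketch, not from this thread): to run the pipeline on a locally hosted model, you could swap the `OpenAIModel` instances for another class from `knowledge_storm/lm.py`, such as `OllamaClient`. The class name and constructor arguments below (`model`, `url`, `port`) are assumptions; check your version of `knowledge_storm/lm.py` for the exact names and signatures.

```python
from knowledge_storm import STORMWikiLMConfigs
from knowledge_storm.lm import OllamaClient  # assumption: verify this class exists in your version of lm.py

# Assumed constructor arguments; verify against the class definition in knowledge_storm/lm.py.
local_lm = OllamaClient(model='llama3', url='http://localhost', port=11434, max_tokens=500)

lm_configs = STORMWikiLMConfigs()
# Any model object from knowledge_storm/lm.py can be passed to these setters,
# and different components can use different models.
lm_configs.set_conv_simulator_lm(local_lm)
lm_configs.set_question_asker_lm(local_lm)
lm_configs.set_outline_gen_lm(local_lm)
lm_configs.set_article_gen_lm(local_lm)
lm_configs.set_article_polish_lm(local_lm)
```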
Is it possible to define my own GPT-style model and use it? If so, which part of the code or configuration needs to be modified?
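One possible route (a sketch only, not an answer given in this thread): if your custom model is served behind an OpenAI-compatible endpoint (e.g. a vLLM server), you may be able to reuse `OpenAIModel` by pointing it at that endpoint, assuming it forwards extra keyword arguments such as `api_base` to the underlying dspy OpenAI client; verify this in `knowledge_storm/lm.py`. Otherwise you would add your own wrapper class to `knowledge_storm/lm.py`, following the pattern of the existing ones, and pass an instance to the `lm_configs` setters. The endpoint URL, model name, and environment variable below are hypothetical.

```python
import os

from knowledge_storm import STORMWikiLMConfigs
from knowledge_storm.lm import OpenAIModel

# Hypothetical self-hosted, OpenAI-compatible endpoint and model name.
# Whether `api_base` is forwarded depends on your knowledge_storm/dspy versions;
# check the OpenAIModel constructor in knowledge_storm/lm.py.
my_lm = OpenAIModel(
    model='my-custom-model',
    api_key=os.getenv('MY_API_KEY'),
    api_base='http://localhost:8000/v1',
    max_tokens=1000,
    temperature=1.0,
    top_p=0.9,
)

lm_configs = STORMWikiLMConfigs()
lm_configs.set_article_gen_lm(my_lm)  # set the other components the same way
```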