Three basic parameters adapted to mainstream platforms and models #193
chenxizhang started this conversation in Use cases · 0 replies
The original intent of this module's design is to smooth over the differences between the mainstream platforms and models on the market, giving users a unified chat (ChatGPT-style) and text/image generation experience. To achieve this, the design centers on three basic parameters that ensure compatibility with almost all mainstream platforms and models.
Currently, the mainstream platforms and models we support include:

- OpenAI
- Azure OpenAI
- Kimi
- Zhìpǔ Qīngyán
- DBRX
- locally hosted models via ollama
- any other platform or model compatible with the OpenAI API

The three key parameters are:
- `api_key`: the key required by the platform or model. It is recommended to store it in the `OPENAI_API_KEY` environment variable.
- `endpoint`: the service endpoint address used by the platform or model. It is recommended to store it in the `OPENAI_API_ENDPOINT` environment variable.
- `model`: the model you want to use. It is recommended to store it in the `OPENAI_API_MODEL` environment variable; the default value is `gpt-3.5-turbo`.

If you have defined all three environment variables, using the `chat` or `gpt` functionality becomes straightforward: you no longer need to worry about these details and can simply focus on what you want to achieve. Otherwise, you will need to specify them on each call. That can also be acceptable; for example, in an automated system you might prefer to pass the parameters directly in the commands rather than rely on external environment variables. Choose the approach that best suits your circumstances.