Port `autogpt.core.resource.model_provider` from AutoGPT to Forge #7001
Labels: architecture (Topics related to package and system architecture), Forge, Roadmapped (Issues that were spawned by roadmap items)
Actionable for 🚀 AutoGPT Roadmap - Empowering Agent Builders 👷 #6970
Proposed new module name: `forge.llm`
Dependencies

- Port `autogpt.core.configuration` into `forge.models.config` and `forge.models.state` #7000
- Port `autogpt.core.utils.json_schema` from AutoGPT to Forge #7002

TODO

- `autogpt.core.resource.model_provider`
Notes
Configuration may need revision
We want Forge components to be portable and usable as stand-alone imports. Modules should be able to configure themselves if no configuration is passed in.
Example: `OpenAI`'s constructor has an `api_key` parameter. If it is not set, the client tries to read the API key from the `OPENAI_API_KEY` environment variable.
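A minimal sketch of that self-configuring pattern, assuming a hypothetical credentials class; the class name and error behavior are illustrative, not the actual Forge API:

```python
import os


class OpenAICredentials:
    """Hypothetical config object that can populate itself from the environment."""

    def __init__(self, api_key: str | None = None):
        # Fall back to the environment, mirroring the OpenAI client's own behavior.
        self.api_key = api_key or os.getenv("OPENAI_API_KEY")
        if self.api_key is None:
            raise ValueError("No api_key passed in and OPENAI_API_KEY is not set")
```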
Our `OpenAIProvider` wraps an `OpenAI` or `AzureOpenAI` client, depending on the configuration. We think it makes sense to preserve this behavior.
Why migrate this module?

The `model_provider` module provides functionality and extensibility that is not available from any multi-model client we know of (e.g. LiteLLM). We would like to support as many models as possible, but for the reasons above we want to keep our own client implementation for now.