[Bug]: litellm_proxy provider is not supported for embeddings #8077
Labels: bug (Something isn't working)
What happened?
As per the documentation for embeddings and the LiteLLM Proxy, I'm using the following code to generate embeddings (where the model is `nomic-embed-text`, which is correctly defined in my proxy).
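Roughly this (a minimal sketch; the proxy address, key, and input text are placeholders for my actual setup):

```python
import litellm

# Call the embedding endpoint through the LiteLLM Proxy, using the
# litellm_proxy/ prefix as the docs describe. The api_base and api_key
# values below are placeholders for my actual proxy address and key.
response = litellm.embedding(
    model="litellm_proxy/nomic-embed-text",
    input=["some text to embed"],
    api_base="http://localhost:4000",
    api_key="sk-1234",
)
```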
However, because the `litellm_proxy` provider is never accounted for in the embedding method (see: https://github.com/BerriAI/litellm/blob/main/litellm/main.py#L3300-L3784), I get an error saying the provider is not supported for embeddings.

One workaround is to ignore `litellm_proxy` and pass `openai` as a custom LLM provider, i.e.:
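A minimal sketch of the workaround (same placeholder address and key as above):

```python
import litellm

# Treat the proxy as a plain OpenAI-compatible endpoint instead of
# going through the litellm_proxy provider. api_base and api_key are
# again placeholders for the real proxy address and key.
response = litellm.embedding(
    model="nomic-embed-text",
    custom_llm_provider="openai",
    input=["some text to embed"],
    api_base="http://localhost:4000",
    api_key="sk-1234",
)
```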
While this works, it feels extremely nasty, and it's weird that communication between the LiteLLM client and the proxy is not supported. Is this intentional?
Relevant log output
Are you an ML Ops Team?
No
What LiteLLM version are you on?
v1.59.8
Twitter / LinkedIn details
No response