[Feature]: Groq - deepseek-r1-distill-llama-70b #8071
Comments
As with all API providers that offer OpenAI compatibility, you can work around LiteLLM model-compatibility gaps by calling the provider through the OpenAI API format. The litellm parameters are:

model="openai/deepseek-r1-distill-llama-70b",  # note the `openai/` prefix
api_base="https://api.groq.com/openai/v1",
api_key=os.getenv("GROQ_API_KEY"),

Docs: LiteLLM OpenAI-Compatible Endpoint Docs. I've run these parameters on my end and the r1-distill works.
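For anyone landing here, a minimal runnable sketch of that workaround (the prompt and the print are my own additions; the model, api_base, and api_key values are exactly the parameters listed above):

```python
import os

import litellm

# Route the request through LiteLLM's OpenAI-compatible handler by using the
# `openai/` prefix and pointing api_base at Groq's OpenAI-style endpoint.
response = litellm.completion(
    model="openai/deepseek-r1-distill-llama-70b",  # note the `openai/` prefix
    api_base="https://api.groq.com/openai/v1",
    api_key=os.getenv("GROQ_API_KEY"),
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)

print(response.choices[0].message.content)
```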
What is the LiteLLM issue you are running into here?
Ah, one correction here. I assumed that the user was running into issues with the native `groq/` route. That said, I do get an error when running it.
Ah, I didn't realize it would work. The documentation has a table of supported models, so it wasn't clear to me that newly listed models should work "out of the box": https://docs.litellm.ai/docs/providers/groq#supported-models---all-groq-models-supported. I just assumed it wouldn't work yet and was proactively opening this issue in anticipation of using it.
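If that table is accurate and newly listed models do work out of the box, the native route should just be the `groq/` prefix with no api_base override. A minimal sketch, assuming only the prompt text (the rest follows the LiteLLM Groq provider docs):

```python
import os

import litellm

# Native provider route: the `groq/` prefix tells LiteLLM to talk to Groq's
# API directly, so no api_base override is needed.
response = litellm.completion(
    model="groq/deepseek-r1-distill-llama-70b",
    api_key=os.getenv("GROQ_API_KEY"),
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)

print(response.choices[0].message.content)
```

Per the Groq provider docs, LiteLLM also reads `GROQ_API_KEY` from the environment, so the explicit `api_key` argument is optional.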
The Feature
This issue is to add support for DeepSeek R1 (deepseek-r1-distill-llama-70b) hosted on Groq Cloud.
Motivation, pitch
It's fast and it's good. Apparently.
Are you an ML Ops Team?
No
Twitter / LinkedIn details
No response