diff --git a/.changeset/perfect-kangaroos-drop.md b/.changeset/perfect-kangaroos-drop.md
new file mode 100644
index 000000000000..57a16f30617b
--- /dev/null
+++ b/.changeset/perfect-kangaroos-drop.md
@@ -0,0 +1,5 @@
+---
+'@ai-sdk/groq': patch
+---
+
+feat (provider/groq): add deepseek r1
diff --git a/content/providers/01-ai-sdk-providers/50-groq.mdx b/content/providers/01-ai-sdk-providers/50-groq.mdx
index ac4eef77a0a7..085d393ce011
--- a/content/providers/01-ai-sdk-providers/50-groq.mdx
+++ b/content/providers/01-ai-sdk-providers/50-groq.mdx
@@ -75,6 +75,26 @@ The first argument is the model id, e.g. `gemma2-9b-it`.
 const model = groq('gemma2-9b-it');
 ```
 
+### Reasoning Models
+
+Groq exposes the thinking of `deepseek-r1-distill-llama-70b` in the generated text using the `<think>` tag.
+You can use the `extractReasoningMiddleware` to extract this reasoning and expose it as a `reasoning` property on the result:
+
+```ts
+import { groq } from '@ai-sdk/groq';
+import {
+  experimental_wrapLanguageModel as wrapLanguageModel,
+  extractReasoningMiddleware,
+} from 'ai';
+
+const enhancedModel = wrapLanguageModel({
+  model: groq('deepseek-r1-distill-llama-70b'),
+  middleware: extractReasoningMiddleware({ tagName: 'think' }),
+});
+```
+
+You can then use that enhanced model in functions like `generateText` and `streamText`.
+
 ### Example
 
 You can use Groq language models to generate text with the `generateText` function:
@@ -91,12 +111,13 @@ const { text } = await generateText({
 
 ## Model Capabilities
 
-| Model                     | Image Input         | Object Generation   | Tool Usage          | Tool Streaming      |
-| ------------------------- | ------------------- | ------------------- | ------------------- | ------------------- |
-| `llama-3.3-70b-versatile` |                     |                     |                     |                     |
-| `llama-3.1-8b-instant`    |                     |                     |                     |                     |
-| `gemma2-9b-it`            |                     |                     |                     |                     |
-| `mixtral-8x7b-32768`      |                     |                     |                     |                     |
+| Model                           | Image Input         | Object Generation   | Tool Usage          | Tool Streaming      |
+| ------------------------------- | ------------------- | ------------------- | ------------------- | ------------------- |
+| `deepseek-r1-distill-llama-70b` |                     |                     |                     |                     |
+| `llama-3.3-70b-versatile`       |                     |                     |                     |                     |
+| `llama-3.1-8b-instant`          |                     |                     |                     |                     |
+| `gemma2-9b-it`                  |                     |                     |                     |                     |
+| `mixtral-8x7b-32768`            |                     |                     |                     |                     |
 
 The table above lists popular models. Please see the [Groq
diff --git a/packages/groq/src/groq-chat-settings.ts b/packages/groq/src/groq-chat-settings.ts
index 8bc3ba5d959d..80c4722dc72c
--- a/packages/groq/src/groq-chat-settings.ts
+++ b/packages/groq/src/groq-chat-settings.ts
@@ -1,6 +1,7 @@
 // https://console.groq.com/docs/models
 // production models
 export type GroqChatModelId =
+  | 'deepseek-r1-distill-llama-70b'
   | 'gemma2-9b-it'
   | 'gemma-7b-it'
   | 'llama-3.3-70b-versatile'
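
The docs hunk above says the wrapped model can then be passed to `generateText` and `streamText`. As a minimal sketch of that follow-on step (assuming a `GROQ_API_KEY` in the environment, and that the result of `generateText` exposes the `reasoning` field populated by `extractReasoningMiddleware`; the prompt and logging are illustrative, not from this PR):

```ts
import { groq } from '@ai-sdk/groq';
import {
  experimental_wrapLanguageModel as wrapLanguageModel,
  extractReasoningMiddleware,
  generateText,
} from 'ai';

// Wrap the DeepSeek R1 distill model so <think>…</think> content is
// stripped from the generated text and surfaced separately.
const enhancedModel = wrapLanguageModel({
  model: groq('deepseek-r1-distill-llama-70b'),
  middleware: extractReasoningMiddleware({ tagName: 'think' }),
});

const { text, reasoning } = await generateText({
  model: enhancedModel,
  prompt: 'How many "r"s are in the word "strawberry"?',
});

console.log(reasoning); // thinking extracted from the <think> tag (assumed field)
console.log(text); // final answer with the <think> block removed
```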