Commit 352441e

Merge pull request #412 from miurla/feat/refactor-streaming-for-manual-tools

feat: Add reasoning model support
miurla authored Feb 1, 2025
2 parents 6488bee + 0c1bd61 commit 352441e
Showing 46 changed files with 1,404 additions and 341 deletions.
3 changes: 3 additions & 0 deletions .env.local.example
@@ -54,6 +54,9 @@ TAVILY_API_KEY=[YOUR_TAVILY_API_KEY] # Get your API key at: https://app.tavily.
# DeepSeek
# DEEPSEEK_API_KEY=[YOUR_DEEPSEEK_API_KEY]

# Fireworks
# FIREWORKS_API_KEY=[YOUR_FIREWORKS_API_KEY]

# OpenAI Compatible Model
# NEXT_PUBLIC_OPENAI_COMPATIBLE_MODEL=
# OPENAI_COMPATIBLE_API_KEY=
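The new key is presumably consumed where providers are registered. A minimal sketch of that wiring, assuming the `@ai-sdk/fireworks` package and the `provider:model` registry convention that `model.split(':')[0]` in the route below implies:

```ts
// Sketch only — the registry shape is an assumption, not this repo's code.
import { createFireworks } from '@ai-sdk/fireworks'
import { createProviderRegistry } from 'ai'

export const registry = createProviderRegistry({
  fireworks: createFireworks({
    apiKey: process.env.FIREWORKS_API_KEY
  })
})

// Models are then addressed as 'provider:model', e.g.
// registry.languageModel('fireworks:accounts/fireworks/models/deepseek-r1')
```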
14 changes: 9 additions & 5 deletions README.md
@@ -2,7 +2,7 @@

An AI-powered search engine with a generative UI.

![capture](/public/screenshot-2025-01-15.png)
![capture](/public/screenshot-2025-01-31.png)

## 🗂️ Overview

@@ -24,6 +24,7 @@ An AI-powered search engine with a generative UI.
- Natural language question understanding
- Multiple search providers support (Tavily, SearXNG, Exa)
- Model selection from UI (switch between available AI models)
- Reasoning models with visible thought process

### Chat & History

@@ -40,6 +41,7 @@ An AI-powered search engine with a generative UI.
- Ollama
- Groq
- DeepSeek
- Fireworks
- OpenAI Compatible

### Search Capabilities
@@ -68,7 +70,7 @@ An AI-powered search engine with a generative UI.

### AI & Search

- [OpenAI](https://openai.com/) - Default AI provider (Optional: Google AI, Anthropic, Groq, Ollama, Azure OpenAI)
- [OpenAI](https://openai.com/) - Default AI provider (Optional: Google AI, Anthropic, Groq, Ollama, Azure OpenAI, DeepSeek, Fireworks)
- [Tavily AI](https://tavily.com/) - Default search provider
- Alternative providers:
- [SearXNG](https://docs.searxng.org/) - Self-hosted search
@@ -176,13 +178,15 @@ This will allow you to use Morphic as your default search engine in the browser.
- Gemini 2.0 Flash (Experimental)
- Anthropic
- Claude 3.5 Sonnet
- Claude 3.5 Haiku
- Ollama
- qwen2.5
- deepseek-r1
- Groq
- llama3-groq-8b-8192-tool-use-preview
- llama3-groq-70b-8192-tool-use-preview
- deepseek-r1-distill-llama-70b
- DeepSeek
- DeepSeek v3 [(Unstable)](https://github.com/vercel/ai/issues/4313#issuecomment-2587891644)
- DeepSeek V3
- DeepSeek R1

## ⚡ AI SDK Implementation

126 changes: 19 additions & 107 deletions app/api/chat/route.ts
@@ -1,15 +1,6 @@
import { getChat, saveChat } from '@/lib/actions/chat'
import { generateRelatedQuestions } from '@/lib/agents/generate-related-questions'
import { researcher } from '@/lib/agents/researcher'
import { ExtendedCoreMessage } from '@/lib/types'
import { convertToExtendedCoreMessages } from '@/lib/utils'
import { isProviderEnabled } from '@/lib/utils/registry'
import {
convertToCoreMessages,
createDataStreamResponse,
JSONValue,
streamText
} from 'ai'
import { createManualToolStreamResponse } from '@/lib/streaming/create-manual-tool-stream'
import { createToolCallingStreamResponse } from '@/lib/streaming/create-tool-calling-stream'
import { isProviderEnabled, isToolCallSupported } from '@/lib/utils/registry'
import { cookies } from 'next/headers'

export const maxDuration = 30
@@ -29,112 +20,33 @@ export async function POST(req: Request) {
})
}

const coreMessages = convertToCoreMessages(messages)
const extendedCoreMessages = convertToExtendedCoreMessages(messages)

const cookieStore = await cookies()
const modelFromCookie = cookieStore.get('selected-model')?.value
const searchMode = cookieStore.get('search-mode')?.value === 'true'
const model = modelFromCookie || DEFAULT_MODEL
const provider = model.split(':')[0]

if (!isProviderEnabled(provider)) {
return new Response(`Selected provider is not enabled ${provider}`, {
status: 404,
statusText: 'Not Found'
})
}

return createDataStreamResponse({
execute: async dataStream => {
try {
let researcherConfig
try {
researcherConfig = await researcher({
messages: coreMessages,
model
})
} catch (error) {
console.error('Researcher configuration error:', error)
throw new Error('Failed to initialize researcher configuration')
}

const result = streamText({
...researcherConfig,
onFinish: async event => {
try {
const responseMessages = event.response.messages

let annotation: JSONValue = {
type: 'related-questions',
data: {
items: []
}
}

// Notify related questions loading
dataStream.writeMessageAnnotation(annotation)

// Generate related questions
const relatedQuestions = await generateRelatedQuestions(
responseMessages,
model
)

// Update the annotation with the related questions
annotation = {
...annotation,
data: relatedQuestions.object
}

// Send related questions to client
dataStream.writeMessageAnnotation(annotation)

// Create the message to save
const generatedMessages = [
...extendedCoreMessages,
...responseMessages.slice(0, -1),
{
role: 'data',
content: annotation
},
responseMessages[responseMessages.length - 1]
] as ExtendedCoreMessage[]

// Get the chat from the database if it exists, otherwise create a new one
const savedChat = (await getChat(chatId)) ?? {
messages: [],
createdAt: new Date(),
userId: 'anonymous',
path: `/search/${chatId}`,
title: messages[0].content,
id: chatId
}

// Save chat with complete response and related questions
await saveChat({
...savedChat,
messages: generatedMessages
}).catch(error => {
console.error('Failed to save chat:', error)
throw new Error('Failed to save chat history')
})
} catch (error) {
console.error('Error in onFinish:', error)
throw error
}
}
})

result.mergeIntoDataStream(dataStream)
} catch (error) {
console.error('Stream execution error:', error)
}
},
onError: error => {
console.error('Stream error:', error)
return error instanceof Error ? error.message : String(error)
}
})
const supportsToolCalling = isToolCallSupported(model)

return supportsToolCalling
? createToolCallingStreamResponse({
messages,
model,
chatId,
searchMode
})
: createManualToolStreamResponse({
messages,
model,
chatId,
searchMode
})
} catch (error) {
console.error('API route error:', error)
return new Response(
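The branch above hinges on `isToolCallSupported`, whose body is not part of this excerpt. A minimal sketch of what such a check could look like, assuming `provider:model` ids (as `model.split(':')[0]` above implies) and that reasoning models like DeepSeek R1 lack native tool calling:

```ts
// Sketch, not necessarily the repository's implementation: models that
// cannot emit native tool calls are routed to the manual tool stream.
export function isToolCallSupported(model?: string): boolean {
  const [provider, modelName] = model?.split(':') ?? []

  // Assumption: Ollama-served models use the manual stream.
  if (provider === 'ollama') return false

  // Assumption: DeepSeek R1 variants expose reasoning instead of tool calls.
  return !modelName?.includes('deepseek-r1')
}
```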
8 changes: 8 additions & 0 deletions app/globals.css
@@ -43,6 +43,10 @@
--chart-4: 43 74% 66%;

--chart-5: 27 87% 67%;

--accent-blue: 210 100% 97%;
--accent-blue-foreground: 210 100% 50%;
--accent-blue-border: 210 100% 90%;
}

.dark {
@@ -78,6 +82,10 @@
--chart-3: 30 80% 55%;
--chart-4: 280 65% 60%;
--chart-5: 340 75% 55%;

--accent-blue: 210 100% 10%;
--accent-blue-foreground: 210 100% 80%;
--accent-blue-border: 210 100% 25%;
}
}

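The new `--accent-blue*` variables follow the shadcn/ui HSL-token convention, so they are presumably exposed as Tailwind colors. A sketch of that wiring — the config below is an assumption, not part of this diff:

```ts
// tailwind.config.ts (sketch): map the new CSS variables to utilities
// such as bg-accent-blue and text-accent-blue-foreground.
import type { Config } from 'tailwindcss'

export default {
  content: ['./app/**/*.{ts,tsx}', './components/**/*.{ts,tsx}'],
  theme: {
    extend: {
      colors: {
        'accent-blue': {
          DEFAULT: 'hsl(var(--accent-blue))',
          foreground: 'hsl(var(--accent-blue-foreground))',
          border: 'hsl(var(--accent-blue-border))'
        }
      }
    }
  }
} satisfies Config
```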
Binary file modified bun.lockb
Binary file not shown.
12 changes: 7 additions & 5 deletions components/answer-section.tsx
@@ -1,10 +1,10 @@
'use client'

import { Text } from 'lucide-react'
import { ChatShare } from './chat-share'
import { CollapsibleMessage } from './collapsible-message'
import { DefaultSkeleton } from './default-skeleton'
import { BotMessage } from './message'
import { MessageActions } from './message-actions'

export type AnswerSectionProps = {
content: string
@@ -30,17 +30,19 @@ export function AnswerSection({
const message = content ? (
<div className="flex flex-col gap-1">
<BotMessage message={content} />
{enableShare && chatId && (
<ChatShare chatId={chatId} className="self-end" />
)}
<MessageActions
message={content}
chatId={chatId}
enableShare={enableShare}
/>
</div>
) : (
<DefaultSkeleton />
)
return (
<CollapsibleMessage
role="assistant"
isCollapsible={true}
isCollapsible={false}
header={header}
isOpen={isOpen}
onOpenChange={onOpenChange}
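`MessageActions` replaces the share-only `ChatShare` control. Its definition is not shown in this diff, but the call site above implies a props contract along these lines (the optionality of each member is an inference):

```ts
// Inferred from the usage above, not copied from the component file.
export interface MessageActionsProps {
  message: string
  chatId?: string
  enableShare?: boolean
}
```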
63 changes: 58 additions & 5 deletions components/chat-messages.tsx
@@ -1,38 +1,80 @@
import { Message } from 'ai'
import { useEffect, useState } from 'react'
import { JSONValue, Message } from 'ai'
import { useEffect, useMemo, useRef, useState } from 'react'
import { RenderMessage } from './render-message'
import { ToolSection } from './tool-section'
import { Spinner } from './ui/spinner'

interface ChatMessagesProps {
messages: Message[]
data: JSONValue[] | undefined
onQuerySelect: (query: string) => void
isLoading: boolean
chatId?: string
}

export function ChatMessages({
messages,
data,
onQuerySelect,
isLoading,
chatId
}: ChatMessagesProps) {
const [openStates, setOpenStates] = useState<Record<string, boolean>>({})
const manualToolCallId = 'manual-tool-call'

// Add ref for the messages container
const messagesEndRef = useRef<HTMLDivElement>(null)

// Scroll to bottom function
const scrollToBottom = () => {
messagesEndRef.current?.scrollIntoView({ behavior: 'instant' })
}

// Scroll to bottom on mount and when messages change
useEffect(() => {
scrollToBottom()
}, [])

useEffect(() => {
const lastMessage = messages[messages.length - 1]
if (lastMessage?.role === 'user') {
setOpenStates({})
setOpenStates({ [manualToolCallId]: true })
}
}, [messages])

// get last tool data for manual tool call
const lastToolData = useMemo(() => {
if (!data || !Array.isArray(data) || data.length === 0) return null

const lastItem = data[data.length - 1] as {
type: 'tool_call'
data: {
toolCallId: string
state: 'call' | 'result'
toolName: string
args: string
}
}

if (lastItem.type !== 'tool_call') return null

const toolData = lastItem.data
return {
state: 'call' as const,
toolCallId: toolData.toolCallId,
toolName: toolData.toolName,
args: toolData.args ? JSON.parse(toolData.args) : undefined
}
}, [data])

if (!messages.length) return null

const lastUserIndex =
messages.length -
1 -
[...messages].reverse().findIndex(msg => msg.role === 'user')

const showSpinner = isLoading && messages[messages.length - 1].role === 'user'
const showLoading = isLoading && messages[messages.length - 1].role === 'user'

const getIsOpen = (id: string) => {
const baseId = id.endsWith('-related') ? id.slice(0, -8) : id
@@ -61,7 +103,18 @@ export function ChatMessages({
/>
</div>
))}
{showSpinner && <Spinner />}
{showLoading &&
(lastToolData ? (
<ToolSection
key={manualToolCallId}
tool={lastToolData}
isOpen={getIsOpen(manualToolCallId)}
onOpenChange={open => handleOpenChange(manualToolCallId, open)}
/>
) : (
<Spinner />
))}
<div ref={messagesEndRef} /> {/* Add empty div as scroll anchor */}
</div>
)
}
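The `lastToolData` memo above expects `tool_call` entries with stringified `args` on the data stream. Those entries would be written by the manual tool stream on the server; a sketch of the producer side, assuming the AI SDK's `DataStreamWriter`:

```ts
// Sketch of the assumed producer (e.g. in
// lib/streaming/create-manual-tool-stream.ts): announce a manual tool
// call so the client renders ToolSection while the call is in flight.
import type { DataStreamWriter } from 'ai'

export function announceToolCall(
  dataStream: DataStreamWriter,
  toolCallId: string,
  toolName: string,
  args: Record<string, unknown>
) {
  dataStream.writeData({
    type: 'tool_call',
    data: {
      toolCallId,
      state: 'call',
      toolName,
      // Serialized here; JSON.parse'd in lastToolData on the client.
      args: JSON.stringify(args)
    }
  })
}
```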
2 changes: 2 additions & 0 deletions components/chat-panel.tsx
@@ -8,6 +8,7 @@ import { useEffect, useRef, useState } from 'react'
import Textarea from 'react-textarea-autosize'
import { EmptyScreen } from './empty-screen'
import { ModelSelector } from './model-selector'
import { SearchModeToggle } from './search-mode-toggle'
import { Button } from './ui/button'
import { IconLogo } from './ui/icons'

@@ -130,6 +131,7 @@ export function ChatPanel({
<div className="flex items-center justify-between p-3">
<div className="flex items-center gap-2">
<ModelSelector />
<SearchModeToggle />
</div>
<div className="flex items-center gap-2">
{messages.length > 0 && (
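`SearchModeToggle` itself is not in this excerpt, but `route.ts` above reads `cookieStore.get('search-mode')?.value === 'true'`, so the toggle presumably persists its state in that cookie. A minimal client-side sketch under that assumption:

```tsx
'use client'

// Sketch only: persists search mode in the 'search-mode' cookie that
// app/api/chat/route.ts reads. Markup and styling are assumptions.
import { useEffect, useState } from 'react'
import { Button } from './ui/button'

export function SearchModeToggle() {
  const [enabled, setEnabled] = useState(false)

  // Read the current value after mount (cookies are unavailable during SSR).
  useEffect(() => {
    setEnabled(document.cookie.includes('search-mode=true'))
  }, [])

  const toggle = () => {
    const next = !enabled
    setEnabled(next)
    document.cookie = `search-mode=${next}; path=/; max-age=${60 * 60 * 24 * 365}`
  }

  return (
    <Button variant={enabled ? 'default' : 'outline'} size="sm" onClick={toggle}>
      Search
    </Button>
  )
}
```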