
help with vercel sdk #4674

Closed
3FrotoDev opened this issue Feb 3, 2025 · 2 comments
Labels
bug Something isn't working

Comments


3FrotoDev commented Feb 3, 2025

Description

Nothing is responding in the chat. When I log the response on the server, it gives me this:

Response {
  status: 200,
  statusText: '',
  headers: Headers {
    'Content-Type': 'text/plain; charset=utf-8',
    'X-Vercel-AI-Data-Stream': 'v1'
  },
  body: ReadableStream { locked: false, state: 'readable', supportsBYOB: false },
  bodyUsed: false,
  ok: true,
  redirected: false,
  type: 'default',
  url: ''
}

Code example

import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4-turbo'),
    system: 'You are a helpful assistant.',
    messages,
  });
  console.log(result.response);
  return result.toDataStreamResponse();
}

app/api/chat/route.tsx:
https://www.codebin.cc/code/cm6ow886c0001jl03nr6bvtuv:5vUaFjyKdGLEUhgcqHYrDCkd1MccVkL5W69vWkoXe6sc

app/chat/page.tsx:
https://www.codebin.cc/code/cm6ow97iv0001l2031qix4xb2:Cu13SzXkuQwwsS96pUSVdXk7B17jNEGM7hgteLe188zD
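
(The full page.tsx is only available at the codebin link above. For context, a minimal client page wired to this route with the AI SDK's useChat hook typically looks like the following sketch; this is a hypothetical reconstruction, not the reporter's actual code.)

'use client';

import { useChat } from '@ai-sdk/react';

// Hypothetical minimal chat page; useChat posts to /api/chat by default.
export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      {messages.map(m => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} placeholder="Say something..." />
      </form>
    </div>
  );
}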

AI provider

"@ai-sdk/openai": "^1.1.9",

Additional context

No response

3FrotoDev added the bug label Feb 3, 2025
Collaborator

lgrammel commented Feb 5, 2025

Can you check that your API key etc. works using a simple generateText call, e.g. https://github.com/vercel/ai/blob/main/examples/ai-core/src/generate-text/openai.ts
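
A minimal standalone check along the lines of the linked example could look like this (a sketch; it assumes OPENAI_API_KEY is set in your environment):

import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

// Sanity check: unlike streamText, an invalid API key or network problem
// here surfaces as a thrown error instead of an empty-looking stream.
async function main() {
  const { text } = await generateText({
    model: openai('gpt-4-turbo'),
    prompt: 'Say hello in one sentence.',
  });
  console.log(text);
}

main().catch(console.error);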

Collaborator

lgrammel commented Feb 6, 2025

I have introduced an onError callback on streamText: #4729 (ai@4.1.22)

streamText immediately starts streaming to enable sending data without waiting for the model.
Errors become part of the stream and are not thrown to prevent e.g. servers from crashing.

To log errors, you can provide an onError callback that is triggered when an error occurs.

import { streamText } from 'ai';

const result = streamText({
  model: yourModel,
  prompt: 'Invent a new holiday and describe its traditions.',
  onError({ error }) {
    console.error(error); // your error logging logic here
  },
});
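
With this callback in place, a failing request (for example an invalid API key or a quota error) shows up in your server logs instead of only producing an empty-looking 200 stream response like the one above.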

lgrammel closed this as completed Feb 6, 2025