I set up llm-ui following the quick start guide at https://llm-ui.com/docs/quick-start exactly, and streaming mostly works. However, when it starts to stream a code block, it waits until the entire code block is finished before displaying it. If it's a lot of code, the block just looks empty for a few seconds.
Is there some additional setting I need that isn't mentioned in the quick start guide?
Here is my useLLMOutput hook:
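It's basically the quick-start setup verbatim; the sketch below is an approximate reconstruction of my file, assuming the standard exports from @llm-ui/react, @llm-ui/code and @llm-ui/markdown shown in the guide (the local import paths for my own MarkdownComponent and CodeBlock are just placeholders):

```tsx
import { useLLMOutput } from "@llm-ui/react";
import {
  codeBlockLookBack,
  findCompleteCodeBlock,
  findPartialCodeBlock,
} from "@llm-ui/code";
import { markdownLookBack } from "@llm-ui/markdown";
import { MarkdownComponent } from "./Markdown"; // my markdown renderer (placeholder path)
import { CodeBlock } from "./CodeBlock"; // shown below (placeholder path)

export const ChatMessage = ({
  output,
  isStreamFinished,
}: {
  output: string; // the streamed LLM text received so far
  isStreamFinished: boolean; // true once the stream has completed
}) => {
  const { blockMatches } = useLLMOutput({
    llmOutput: output,
    isStreamFinished,
    // Code blocks get their own block type with complete + partial matchers
    blocks: [
      {
        component: CodeBlock,
        findCompleteMatch: findCompleteCodeBlock(),
        findPartialMatch: findPartialCodeBlock(),
        lookBack: codeBlockLookBack(),
      },
    ],
    // Everything else falls back to the markdown renderer
    fallbackBlock: {
      component: MarkdownComponent,
      lookBack: markdownLookBack(),
    },
  });

  return (
    <div>
      {blockMatches.map((blockMatch, index) => {
        const Component = blockMatch.block.component;
        return <Component key={index} blockMatch={blockMatch} />;
      })}
    </div>
  );
};
```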
and here is my CodeBlock component:
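Again, this is essentially the Shiki-based CodeBlock from the quick-start; the snippet below is an approximate reconstruction, and the exact Shiki import paths are assumed from the guide:

```tsx
import type { LLMOutputComponent } from "@llm-ui/react";
import {
  allLangs,
  allLangsAlias,
  loadHighlighter,
  useCodeBlockToHtml,
} from "@llm-ui/code";
import parseHtml from "html-react-parser";
import { getHighlighterCore } from "shiki/core";
import { bundledLanguagesInfo } from "shiki/langs";
import githubDark from "shiki/themes/github-dark.mjs";
import getWasm from "shiki/wasm";

// Load the Shiki highlighter once, outside the component
const highlighter = loadHighlighter(
  getHighlighterCore({
    langs: allLangs(bundledLanguagesInfo),
    langAlias: allLangsAlias(bundledLanguagesInfo),
    themes: [githubDark],
    loadWasm: getWasm,
  }),
);

const codeToHtmlOptions = { theme: "github-dark" };

export const CodeBlock: LLMOutputComponent = ({ blockMatch }) => {
  const { html, code } = useCodeBlockToHtml({
    markdownCodeBlock: blockMatch.output,
    highlighter,
    codeToHtmlOptions,
  });
  if (!html) {
    // Plain <pre> fallback while the highlighter is still loading
    return (
      <pre className="shiki">
        <code>{code}</code>
      </pre>
    );
  }
  return <>{parseHtml(html)}</>;
};
```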
However, when I try the demo at https://llm-ui.com/chat, it does stream code blocks as they arrive. So what setting am I missing?