
Fix: Add simulated streaming support for Ollama provider #402

Merged
merged 1 commit into main from fix/simulate-streaming-for-ollama
Jan 18, 2025

Conversation

miurla (Owner) commented Jan 18, 2025

fix: #401

Note

Ollama provider v1.2.0 supports `simulateStreaming`:

https://github.com/sgomez/ollama-ai-provider/releases/tag/ollama-ai-provider%401.2.0

Screenshot: [image]

  • model: qwen2.5
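For context, "simulated streaming" lets a provider that only returns complete responses still drive an incremental UI: the full text is generated first, then replayed to the client as a sequence of text deltas. A minimal conceptual sketch of that idea (the function names and chunk size here are illustrative, not the provider's actual implementation):

```typescript
// Conceptual sketch of simulated streaming: the provider has already
// produced the full response text; we replay it as small chunks so the
// UI can render it incrementally, just as with a real streaming model.
async function* simulateStream(
  fullText: string,
  chunkSize = 8
): AsyncGenerator<string> {
  for (let i = 0; i < fullText.length; i += chunkSize) {
    // Each yielded slice plays the role of one text-delta event.
    yield fullText.slice(i, i + chunkSize);
  }
}

// Helper that drains the simulated stream into an array of chunks.
async function collect(fullText: string): Promise<string[]> {
  const chunks: string[] = [];
  for await (const chunk of simulateStream(fullText, 4)) {
    chunks.push(chunk);
  }
  return chunks;
}
```

In the app itself, enabling this should amount to passing the setting when creating the model, e.g. `ollama('qwen2.5', { simulateStreaming: true })`, per the ollama-ai-provider v1.2.0 release linked above.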

vercel bot commented Jan 18, 2025

The latest updates on your projects:

Name: morphic — Status: ✅ Ready — Updated (UTC): Jan 18, 2025 5:34am

@miurla miurla merged commit 32eb8f6 into main Jan 18, 2025
2 checks passed
@miurla miurla deleted the fix/simulate-streaming-for-ollama branch January 18, 2025 05:37
Development

Successfully merging this pull request may close these issues:

[BUG] ollama replies do not show up in 0.3.0
1 participant