HuggingFaceH4/starchat-alpha CPP LLM #1441

Closed
ekolawole opened this issue May 14, 2023 · 3 comments

@ekolawole

What we need is a local C++ implementation for the LLM "HuggingFaceH4/starchat-alpha". It is much better than any other open-source chat model at coding, and it does very well at chatting too. It is better than Vicuna.

@chat-guy

This would be great. It is indeed significantly better than Vicuna as far as code-generation is concerned.

@s-kostyaev

Looks like we need to integrate https://github.com/ggerganov/ggml/tree/master/examples/starcoder into llama.cpp
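A rough sketch of what that could look like until native support lands in llama.cpp, assuming the ggml starcoder example's layout and usage (the convert script, the `starcoder` and `starcoder-quantize` binaries, and their flags) roughly follow that example's README; the exact script names, output paths, and options are assumptions and may differ:

```bash
# Hypothetical workflow based on the ggml starcoder example; script names,
# output paths, and flags are assumptions and may differ from the actual repo.
git clone https://github.com/ggerganov/ggml
cd ggml && mkdir build && cd build
cmake .. && make -j starcoder starcoder-quantize
cd ..

# Convert the Hugging Face checkpoint to ggml format (assumed script name
# and output path).
python3 examples/starcoder/convert-hf-to-ggml.py HuggingFaceH4/starchat-alpha

# Optionally quantize, then run a prompt (assumed flags and quantization type).
./build/bin/starcoder-quantize models/HuggingFaceH4/starchat-alpha-ggml.bin \
    models/HuggingFaceH4/starchat-alpha-ggml-q4_0.bin 2
./build/bin/starcoder -m models/HuggingFaceH4/starchat-alpha-ggml-q4_0.bin \
    -p "def fibonacci(n):" --temp 0.2
```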


github-actions bot commented Apr 9, 2024

This issue was closed because it has been inactive for 14 days since being marked as stale.


3 participants