
Power with llama.cpp under the hood instead of Ollama #6

Open
nischalj10 opened this issue Jun 3, 2024 · 1 comment
Labels: enhancement (New feature or request), help wanted (Extra attention is needed)

Comments

@nischalj10 (Owner) commented:

See nischalj10/headless-ollama#1 for more details.

@nischalj10 added the enhancement and help wanted labels on Jun 3, 2024.
@nischalj10 (Owner, Author) commented:

Update: llama.cpp has stopped supporting VLMs on its server; see ggml-org/llama.cpp#5882.
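For background on what the swap would involve: Ollama serves its native API on port 11434, while llama.cpp's `llama-server` listens on port 8080 by default and exposes an OpenAI-compatible `/v1/chat/completions` endpoint, so for text-only models the change is largely a matter of base URL and payload shape. A minimal sketch (hosts/ports are the documented defaults; the model name is a placeholder, and per the update above, VLM/image inputs would not work against the llama.cpp server):

```python
import json
import urllib.request

# Default endpoints (assumptions: both servers running locally on default ports).
LLAMA_CPP_URL = "http://127.0.0.1:8080/v1/chat/completions"  # llama-server, OpenAI-style
OLLAMA_URL = "http://127.0.0.1:11434/api/chat"               # Ollama native chat API

def build_chat_request(prompt: str, model: str = "default") -> dict:
    """Build an OpenAI-style chat payload accepted by llama-server.

    Ollama's /api/chat happens to accept the same model/messages shape,
    which is what makes the backend swap mostly a URL change for text chat.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def send(url: str, payload: dict) -> dict:
    """POST the payload as JSON and return the decoded JSON response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

payload = build_chat_request("Hello, which backend are you?")
print(payload["messages"][0]["role"])
# send(LLAMA_CPP_URL, payload)  # requires a running llama-server
```

The payload-building and transport code is identical for both backends; only the URL (and, for llama-server, starting the process with a `.gguf` model) differs.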
