feat: do not bundle llama-cpp anymore #5790
Conversation
✅ Deploy Preview for localai ready!
So a completely separate Dockerfile and Makefile? This will be a major improvement!
Yup! My plan is to isolate everything, one backend at a time. Currently the llama.cpp one is the heaviest, and it also has lots of specific code in the Go part; ideally I want to get all of the llama.cpp-specific code and the binary-bundling bits out of the main code. This is how I'm testing things now with #5816 in:
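For a rough picture of what isolating a backend could look like, here is a minimal sketch assuming one Dockerfile/Makefile per backend; every path, target and image name below is a placeholder, not necessarily what this PR or #5816 actually use:

```sh
# Minimal sketch; all paths, targets and image names are placeholders.

# Build the llama.cpp backend on its own, with its own Dockerfile/Makefile,
# instead of bundling the binary into the main LocalAI image:
docker build -t llama-cpp-backend -f backend/llama-cpp/Dockerfile backend/llama-cpp

# The main LocalAI binary can then be built with standard Go tooling,
# without any llama.cpp-specific build steps:
go build -o local-ai .
```

Since LocalAI already treats backends as gRPC services, an image built this way can be consumed as an external backend rather than a bundled one.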
Yes, actually a good point. My plan is to move all the backends out so we can build LocalAI in a simpler way using standard Go tooling. At that point I will rework the documentation; at this stage it is not really functional and in a "transient" state. However, for now the steps are the same.
The difference is in how backends are built. If you want llama-cpp, for example, you can install it from the Backends tab in the webui, or from the CLI.
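A hedged sketch of the CLI route, assuming a `backends install` subcommand (the exact subcommand and backend name may differ depending on the LocalAI version):

```sh
# Hedged sketch: subcommand and backend name may differ per LocalAI version.
local-ai backends install llama-cpp
```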
This does the following:
Description
This PR fixes #
Notes for Reviewers
Signed commits