## README.md (+7 -47)
[Tests](https://github.com/go-skynet/LocalAI/actions/workflows/test.yml) [Release](https://github.com/go-skynet/LocalAI/actions/workflows/release.yaml) [Image](https://github.com/go-skynet/LocalAI/actions/workflows/image.yml) [Bump dependencies](https://github.com/go-skynet/LocalAI/actions/workflows/bump_deps.yaml) [Artifact Hub](https://artifacthub.io/packages/search?repo=localai)
**LocalAI** is the free, Open Source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that's compatible with OpenAI API specifications for local inferencing. It allows you to run LLMs, generate images and audio (and more) locally or on-prem with consumer-grade hardware, supporting multiple model families. It does not require a GPU.
<img src="https://img.shields.io/twitter/follow/_LocalAI?label=Share Repo on Twitter&style=social" alt="Follow _LocalAI"/></a>
<a href="https://t.me/share/url?text=Check%20this%20GitHub%20repository%20out.%20LocalAI%20-%20Let%27s%20you%20easily%20run%20LLM%20locally.&url=https://github.com/go-skynet/LocalAI" target="_blank"><img src="https://img.shields.io/twitter/url?label=Telegram&logo=Telegram&style=social&url=https://github.com/go-skynet/LocalAI" alt="Share on Telegram"/></a>
<img src="https://img.shields.io/twitter/url?label=Reddit&logo=Reddit&style=social&url=https://github.com/go-skynet/LocalAI" alt="Share on Reddit"/>
</a> <a href="mailto:?subject=Check%20this%20GitHub%20repository%20out.%20LocalAI%20-%20Let%27s%20you%20easily%20run%20LLM%20locally.%3A%0Ahttps://github.com/go-skynet/LocalAI" target="_blank"><img src="https://img.shields.io/twitter/url?label=Gmail&logo=Gmail&style=social&url=https://github.com/go-skynet/LocalAI"/></a> <a href="https://www.buymeacoffee.com/mudler" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/default-orange.png" alt="Buy Me A Coffee" height="23" width="100" style="border-radius:1px"></a>
If you want to help and contribute, issues are up for grabs: https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22up+for+grabs%22
<hr>
In a nutshell:
- Local, OpenAI drop-in alternative REST API. You own your data.
- NO GPU required. NO Internet access is required either
- Optional GPU acceleration is available for `llama.cpp`-compatible LLMs. See also the [build section](https://localai.io/basics/build/index.html).
- Supports multiple models
- 🏃 Once loaded the first time, it keeps models loaded in memory for faster inference
- ⚡ Doesn't shell out, but uses C++ bindings for faster inference and better performance.
LocalAI was created by [Ettore Di Giacinto](https://github.com/mudler/) and is a community-driven project, focused on making AI accessible to anyone. Any contribution, feedback, and PRs are welcome!
Note that this started as a [fun weekend project](https://localai.io/#backstory) to create the necessary pieces for a full AI assistant like `ChatGPT`; the community is growing fast and we are working hard to make it better and more stable. If you want to help, please consider contributing (see below)!
## docs/content/_index.en.md (+5 -44)
**LocalAI** is the free, Open Source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that's compatible with OpenAI API specifications for local inferencing. It allows you to run LLMs, generate images and audio (and more) locally or on-prem with consumer-grade hardware, supporting multiple model families that are compatible with the ggml format. It does not require a GPU. It is maintained by [mudler](https://github.com/mudler).
<img src="https://img.shields.io/twitter/follow/_LocalAI?label=Share Repo on Twitter&style=social" alt="Follow _LocalAI"/></a>
<a href="https://t.me/share/url?text=Check%20this%20GitHub%20repository%20out.%20LocalAI%20-%20Let%27s%20you%20easily%20run%20LLM%20locally.&url=https://github.com/go-skynet/LocalAI" target="_blank"><img src="https://img.shields.io/twitter/url?label=Telegram&logo=Telegram&style=social&url=https://github.com/go-skynet/LocalAI" alt="Share on Telegram"/></a>
<img src="https://img.shields.io/twitter/url?label=Reddit&logo=Reddit&style=social&url=https://github.com/go-skynet/LocalAI" alt="Share on Reddit"/>
</a> <a href="mailto:?subject=Check%20this%20GitHub%20repository%20out.%20LocalAI%20-%20Let%27s%20you%20easily%20run%20LLM%20locally.%3A%0Ahttps://github.com/go-skynet/LocalAI" target="_blank"><img src="https://img.shields.io/twitter/url?label=Gmail&logo=Gmail&style=social&url=https://github.com/go-skynet/LocalAI"/></a> <a href="https://www.buymeacoffee.com/mudler" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/default-orange.png" alt="Buy Me A Coffee" height="23" width="100" style="border-radius:1px"></a>
</p>
<hr>
In a nutshell:
- Local, OpenAI drop-in alternative REST API. You own your data.
- 🏃 Once loaded the first time, it keeps models loaded in memory for faster inference
- ⚡ Doesn't shell out, but uses C++ bindings for faster inference and better performance.
LocalAI is focused on making AI accessible to anyone. Any contribution, feedback, and PRs are welcome!
Note that this started as a fun weekend project by [mudler](https://github.com/mudler) to create the necessary pieces for a full AI assistant like `ChatGPT`; the community is growing fast and we are working hard to make it better and more stable. If you want to help, please consider contributing (see below)!
## 🚀 Features
- 🖼️ [Download Models directly from Hugging Face](https://localai.io/models/)
If you want to help and contribute, issues are up for grabs: https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22up+for+grabs%22
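Downloaded models are typically exposed through a small YAML definition placed in the models directory. The exact schema lives in the LocalAI documentation; the fragment below is only an illustrative sketch, and the file name, model name, and weights file are hypothetical:

```yaml
# models/gpt-3.5-turbo.yaml — hypothetical example definition.
# `name` is the model name clients send in OpenAI-style requests;
# `parameters.model` points at the weights file in the models directory.
name: gpt-3.5-turbo
parameters:
  model: ggml-gpt4all-j.bin
  temperature: 0.7
```

Because clients address the model by `name`, you can swap the underlying weights without touching application code.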
## How does it work?
LocalAI is an API written in Go that serves as an OpenAI shim, enabling software already developed with OpenAI SDKs to seamlessly integrate with LocalAI. It can be effortlessly deployed as a substitute, even on consumer-grade hardware. This is achieved by employing various C++ backends, including [ggml](https://github.com/ggerganov/ggml), to perform inference on LLMs using the CPU and, if desired, the GPU. Internally, LocalAI backends are just gRPC servers: you can build your own gRPC server and extend LocalAI at runtime as well. It is also possible to specify external gRPC servers and/or binaries that LocalAI will manage internally.
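Because the API mirrors OpenAI's REST surface, any HTTP client can talk to it by pointing at the local server instead of OpenAI. A minimal stdlib-only sketch, assuming a LocalAI instance on the default `localhost:8080` with a model named `gpt-3.5-turbo` configured on it (both are assumptions, not part of this document):

```python
import json
import urllib.request

# Assumed local endpoint; adjust host/port to your deployment.
BASE_URL = "http://localhost:8080"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def chat(model: str, prompt: str) -> str:
    """POST the payload to /v1/chat/completions (requires a running server)."""
    req = urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(build_chat_request(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The response follows the OpenAI chat-completion schema.
    return body["choices"][0]["message"]["content"]
```

Swapping `BASE_URL` for OpenAI's endpoint (plus an `Authorization` header) would make the same code target OpenAI, which is the drop-in property described above.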
LocalAI couldn't have been built without the help of great software already available, including:

- https://github.com/rhasspy/piper
- https://github.com/cmp-nct/ggllm.cpp
## Backstory
As with many typical open source projects, I, [mudler](https://github.com/mudler/), was fiddling around with [llama.cpp](https://github.com/ggerganov/llama.cpp) over my long nights and wanted a way to call it from `go`, as I am a Golang developer and use it extensively. So I created `LocalAI` (or what was initially known as `llama-cli`) and added an API to it.