This project provides a free, local alternative to cloud-based language models. It offers a fully local LLM chat, a Retrieval-Augmented Generation (RAG) app, and a vector database chat. Run the Streamlit app locally and build your own knowledge base.
Before you get started, make sure you have the following:
- Python 3.7 or higher installed.
- Mistral 7B installed.
To set up the project on your local machine, follow these steps:
- Install Mistral 7B for your operating system.
- Clone the repository to your local machine.
- Install the required dependencies: `pip install -r requirements.txt`
- Run the Streamlit application: `streamlit run main.py` (a minimal sketch of such an app follows this list)
- Open the application in your browser.
- Enjoy!
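For orientation, here is a minimal sketch of the kind of chat loop `main.py` implements. It assumes Mistral 7B is served locally through Ollama and uses Langchain's community integration; the runtime and the model name `mistral` are assumptions, so adapt the LLM class to however you run the model.

```python
# Minimal Streamlit chat sketch. Assumes a local Mistral 7B served via
# Ollama (an assumption; swap in whatever local runtime you use).
import streamlit as st
from langchain_community.llms import Ollama

llm = Ollama(model="mistral")  # "mistral" is an assumed local model name

st.title("Local LLM Chat")

# Keep the conversation in Streamlit's session state across reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

if prompt := st.chat_input("Ask the local model anything"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)
    # Generate a completion entirely on the local machine.
    reply = llm.invoke(prompt)
    st.session_state.messages.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.write(reply)
```

Because everything runs against a local model, no prompt or response ever leaves your machine.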
The project is structured as follows:

```
├── main.py                  # Main script to run the project.
├── notebooks/               # Jupyter notebooks with instructions and tests.
├── README.md                # This file.
├── requirements.txt         # List of dependencies.
└── rslt_logo_dark_mode.png  # Image used in the application.
```
All data used in this project is stored locally. It consists of a large text corpus, which is embedded and used to ground responses in the LLM chat, the RAG app, and the vector database chat.
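As an illustration, here is a hedged sketch of how such a corpus might be loaded into ChromaDB for retrieval. The storage path, collection name, corpus filename, and fixed-size chunking are assumptions for the example, not the project's exact setup.

```python
# Sketch of building a local knowledge base with ChromaDB. The path,
# collection name, corpus filename, and chunk size are illustrative assumptions.
import chromadb

client = chromadb.PersistentClient(path="./chroma_db")  # local on-disk store
collection = client.get_or_create_collection("knowledge_base")

# Naive fixed-size chunking of a local corpus file (assumed name).
with open("corpus.txt", encoding="utf-8") as f:
    text = f.read()
chunks = [text[i:i + 1000] for i in range(0, len(text), 1000)]

# Chroma embeds the chunks with its default embedding function.
collection.add(
    documents=chunks,
    ids=[f"chunk-{i}" for i in range(len(chunks))],
)

# Retrieve the chunks most relevant to a question, for use as RAG context.
results = collection.query(
    query_texts=["What does the corpus say about X?"],
    n_results=3,
)
print(results["documents"][0])
```

The retrieved chunks can then be prepended to the prompt sent to the local model, which is the core idea behind the RAG app.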
The following resources were used in the development of this project:
- Langchain: https://www.langchain.com/
- Mistral AI: https://mistral.ai/
- Streamlit: https://streamlit.io/
- ChromaDB: https://www.trychroma.com/
We hope you find this project useful and look forward to your contributions!
Contributions are always welcome! If you have any ideas or suggestions, feel free to open an issue or submit a pull request.