This build will hit a 4096-token "memory input too large" error when a large prompt is pasted into the UI
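One workaround is to trim a pasted prompt to the model's 4096-token context before it reaches the model. The sketch below is illustrative only, assuming llama-cpp-python (a version that still loads GGML files) and the default model path from this README; it is not part of uiux.py.
from llama_cpp import Llama
# assumption: default model path and 4096-token context window taken from this README
llm = Llama(model_path="models/llama-2-7b-chat-codeCherryPop.ggmlv3.q4_1.bin", n_ctx=4096)
def fit_prompt(text, reserve=512):
    # tokenize the pasted prompt and keep only the most recent tokens so that
    # prompt length plus a generation budget stays inside the context window
    tokens = llm.tokenize(text.encode("utf-8"))
    budget = 4096 - reserve
    if len(tokens) > budget:
        tokens = tokens[-budget:]
    return llm.detokenize(tokens).decode("utf-8", errors="ignore")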
updates continue at Professor Codephreak v2
This Professor Codephreak build is being left intact as the first example of the working aGLM prototype
work continues at automindx, the next evolution of Professor Codephreak v1's core intention: creating AUTOMINDx
To install: right-click automind.install, choose "Save link as ...", then run chmod +x automind.install && ./automind.install
detailed and verbose procedure
- Right-click the following link: automind.install
- Choose "Save link as..." or "Download linked file" from the context menu.
- Select a location on your computer to save the file.
- From the terminal, run:
- chmod +x automind.install && ./automind.install
---------------------------------
Delivers llama2-7b-chat-codeCherryPop-qLoRA-GGML
Example: loading automind on Ubuntu 22.04 LTS
Creates Professor Codephreak
Professor Codephreak is an expert in machine learning, computer science and computer programming
codephreak agenda: to create AUTOMINDx autonomous deployment
default model llama-2-7b-chat-codeCherryPop.ggmlv3.q4_1.bin
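As an illustration of what the default model delivers, it can be exercised directly with llama-cpp-python, assuming a version that still supports GGML files and that the .bin file sits in the models folder; this is a sketch, not the uiux.py code path.
from llama_cpp import Llama
# assumption: the default GGML file has been downloaded into ./models
llm = Llama(model_path="models/llama-2-7b-chat-codeCherryPop.ggmlv3.q4_1.bin", n_ctx=4096)
out = llm("Q: Who is Professor Codephreak? A:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"])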
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
chmod +x Miniconda3-latest-Linux-x86_64.sh
# run the installer as your user (not with sudo) so conda installs under your home directory
./Miniconda3-latest-Linux-x86_64.sh
# reload the shell configuration; adjust for your shell (example is for bash)
source ~/.bashrc
conda create --name automind python=3.9.1
# activate the automind environment
conda activate automind
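Once the environment is active, a quick sanity check that the automind interpreter is the one in use (plain Python, no assumptions beyond the environment created above):
import sys
# expect a path containing envs/automind and a version string starting with 3.9.1
print(sys.executable)
print(sys.version)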
# conda reference: list environments and packages, then deactivate
conda env list
conda list
conda search package_name --info
conda deactivate
git clone https://github.com/Professor-Codephreak/automind
cd automind
#install pip if you haven't already
sudo apt install python3-pip
#display version of pip installed
pip3 --version
#install automind requirements with pip
pip install -r requirements.txt
python3 uiux.py --model_name="TheBloke/llama2-7b-chat-codeCherryPop-qLoRA-GGML" --tokenizer_name="TheBloke/llama2-7b-chat-codeCherryPop-qLoRA-GGML" --model_type="ggml" --save_history --file_name="llama-2-7b-chat-codeCherryPop.ggmlv3.q4_1.bin"
# file structure
A model_name.txt file placed in the models folder alongside your model is read automatically; the command-line call to uiux.py above overrides the value in model_name.txt (a sketch of this precedence follows below)
models folder = models
memory folder = memory
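A hedged sketch of that precedence (illustrative only; the function below is not the actual uiux.py implementation):
import os
def resolve_model_file(cli_file_name=None, models_dir="models"):
    # a --file_name passed on the command line wins; otherwise fall back to
    # the name stored in models/model_name.txt
    if cli_file_name:
        return os.path.join(models_dir, cli_file_name)
    with open(os.path.join(models_dir, "model_name.txt")) as fh:
        return os.path.join(models_dir, fh.read().strip())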
# troubleshooting
llama.cpp source build dependencies include
sudo apt-get install build-essential cmake gcc g++ git python3-dev python3-pip libstdc++6 make pkg-config
# git and wget
sudo apt-get install git wget
# manual llama-cpp-python pip install and uninstall
pip uninstall llama-cpp-python
pip install llama-cpp-python
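After a reinstall, a quick check that the module built and imports cleanly (a sketch; assumes Python 3.8+ for importlib.metadata):
from importlib.metadata import version
import llama_cpp  # fails here if the native build did not produce a usable module
print("llama-cpp-python", version("llama-cpp-python"))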
# On Ubuntu 22.04.6 automind.install works with cmake version 3.27.2
cmake --version
# On Ubuntu 22.04.6 automind.install works with gcc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0
gcc --version
# config gcc alternatives
sudo update-alternatives --config gcc
# install pip3
sudo apt-get install python3-pip
pip3 --version
# diagnostics
sudo apt-get install hardinfo htop nvtop
# complete install sequence
sudo apt-get install build-essential cmake gcc g++ git python3-dev python3-pip libstdc++6 make pkg-config
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
chmod +x Miniconda3-latest-Linux-x86_64.sh
./Miniconda3-latest-Linux-x86_64.sh
source ~/.bashrc
conda create --name automind python=3.9.1
conda init
source ~/.bashrc
conda activate automind
git clone https://github.com/Professor-Codephreak/automind
cd automind
#install pip if you haven't already
sudo apt install python3-pip
#display version of pip installed
pip3 --version
#install automind requirements with pip
pip install -r requirements.txt
# RUN codephreak
python3 uiux.py --model_name="TheBloke/llama2-7b-chat-codeCherryPop-qLoRA-GGML" --tokenizer_name="TheBloke/llama2-7b-chat-codeCherryPop-qLoRA-GGML" --model_type="ggml" --save_history --file_name="llama-2-7b-chat-codeCherryPop.ggmlv3.q4_1.bin"