This UI is a Next.js application built with PatternFly UI library components.
Set the `.env` in the `ui` directory and run the following:

```shell
cd ui/
npm install
npm run dev
# or for prod
npm run build
npm run start
```
```shell
# Run a production build (outputs to the ".next" dir)
npm run build
# Start the Next.js server (requires a production build first)
npm run start
# Lint the project
npm run lint
# Automatically fix linting issues
npm run lint:fix
# Format code using Prettier
npm run pretty
# Run type checking
npm run type-check
```
Podman is a requirement. Install and init instructions can be found in the Podman documentation.

```shell
# Run markdown linter
make md-lint
```
Set the `.env` in the `ui` directory and make sure you uncomment `IL_UI_DEPLOYMENT=dev`. Once the `.env` file is set up, run the following:

```shell
make start-dev-local
```
This will start the UI and the dependent pathservice locally on the machine.

**Note:** It might ask for permission to listen on port 4000.
To stop the local dev environment, run the following:

```shell
make stop-dev-local
```
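Once the local dev environment is running, a quick way to confirm both services are up is to probe their ports. Port 4000 for the pathservice comes from the note above; port 3000 is the UI address used later in this document. This is an optional sanity check, not part of the Makefile:

```shell
# Check whether the UI (3000) and pathservice (4000) are listening
# locally. Uses bash's /dev/tcp so no extra tools are required.
for port in 3000 4000; do
  if (echo > /dev/tcp/127.0.0.1/"$port") 2>/dev/null; then
    echo "port $port: open"
  else
    echo "port $port: closed"
  fi
done
```

If either port shows as closed, check the output of `make start-dev-local` for errors.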
Set the `.env` in the `ui` directory and run the following:

```shell
make start-dev-kind
```
This will start the Kind cluster and deploy the UI stack's manifest files into the cluster. To stop the Kind cluster and delete the UI stack resources, run the following:

```shell
make stop-dev-kind
```
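After `make start-dev-kind`, you can sanity-check the deployment with standard `kubectl` commands (assuming `kubectl` is installed and its current context points at the Kind cluster; `-A` is used because the namespace names the Makefile creates are not specified here):

```shell
# List all pods across namespaces to confirm the UI stack came up;
# falls back to a message if kubectl or the cluster is unavailable.
if command -v kubectl >/dev/null 2>&1; then
  kubectl get pods -A 2>/dev/null || echo "no cluster reachable yet"
else
  echo "kubectl is not installed"
fi
```

Pods stuck in a non-`Running` state usually mean the manifests failed to deploy; `kubectl describe pod <name>` gives more detail.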
Use `make help` to see all the available commands.
You can either set up an OAuth app in your GitHub account or use the username/password defined in `.env`. To change those defaults, create the `/ui/.env` file and fill in the account username/password with the following.
Example .env file.
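A minimal sketch of what such a `.env` file might look like. Only `IL_UI_DEPLOYMENT` is named elsewhere in this document; the credential variable names below are illustrative placeholders, not confirmed against the repository, so check the repo's example env file for the real keys:

```shell
# Deployment mode -- uncomment for local development (see above).
IL_UI_DEPLOYMENT=dev

# Fallback login credentials. NOTE: these variable names are
# hypothetical placeholders, not verified against the repo.
ADMIN_USERNAME=admin
ADMIN_PASSWORD=password
```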
For the chat functionality to work, you need an ilab model chat instance. To run this locally:

```shell
cd server
```

Then follow the InstructLab getting started guide: https://github.com/instructlab/instructlab?tab=readme-ov-file#-getting-started
After you use the `ilab serve` command you should have, by default, a chat server instance running on port 8000.
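Once the server is up, you can verify it from another terminal. This sketch assumes the chat server exposes an OpenAI-compatible API on the default port, which is consistent with how the playground endpoint is configured later in this document:

```shell
# List the models the chat server exposes; prints a fallback
# message if the server is not reachable yet.
curl -s --max-time 5 http://127.0.0.1:8000/v1/models \
  || echo "server not reachable on port 8000"
```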
- The Docker image that runs the server does not utilise the Mac Metal GPU and is therefore very slow when answering prompts.
- The Docker image is very large because it contains the model itself. The model could potentially be supplied via a Docker volume instead, to reduce the size of the actual image.
```shell
docker run -p 8000:8000 aevo987654/instructlab_chat_8000:v2
```

This should run a server on port 8000.
Return to the root of the repo (`ui`), run `npm run dev`, and visit http://localhost:3000/playground/endpoints.
Click the **Add Endpoint** button and a popup modal will appear.
- URL: add `http://127.0.0.1:8000`
- Model Name: add `merlinite-7b-lab-Q4_K_M.gguf`
- API Key: add some random characters
Click the **Save** button.
Go to the chat interface at http://localhost:3000/playground/chat and select the `merlinite-7b-lab-Q4_K_M.gguf` model.
The chat interface should now use the server.
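You can also exercise the endpoint directly from the command line. This is a sketch assuming the server speaks the OpenAI chat-completions API (consistent with the endpoint settings above); the API key is arbitrary, matching the random characters entered in the form:

```shell
# Send a single chat message to the locally served model; prints a
# fallback message if the server is not reachable.
curl -s --max-time 10 http://127.0.0.1:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer some-random-key" \
  -d '{"model": "merlinite-7b-lab-Q4_K_M.gguf", "messages": [{"role": "user", "content": "Hello!"}]}' \
  || echo "chat server not reachable on port 8000"
```

A JSON response with a `choices` array indicates the server is answering prompts correctly.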