LAIB# (Local AI Bash) is an interactive terminal application that integrates a local AI model for natural-language command generation. It includes features like whitelisting, blacklisting, and manual command review to ensure safety and usability. LAIB# has currently been tested only on Linux, but may work on other operating systems with small code changes.
- Generate commands using natural language queries.
- Uses local LLM models (LMStudio, https://lmstudio.ai/, required).
- Manage whitelisted (allowed) and blacklisted (blocked) commands.
- Review and edit blocked AI-generated commands before execution.
- Protection against infinite loops and common dangerous commands.
- Automatically execute generated commands in a real terminal.
- Integrated terminal for manual command entry.
- Menus and search: access menus for customization and search through terminal output.
To use this application, ensure you have:
- Python 3.8+ installed on your system.
- LMStudio running locally to interact with the AI model (https://lmstudio.ai/).
- An LLM model downloaded from LMStudio (see suggestions below).
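As a quick sanity check before installing, the Python version requirement can be verified from Python itself. The helper below is illustrative and not part of LAIB#:

```python
import sys

# LAIB# requires Python 3.8 or newer; this hypothetical helper
# checks the running interpreter (or any version tuple passed in).
def python_ok(version_info=sys.version_info):
    return tuple(version_info[:2]) >= (3, 8)

if not python_ok():
    raise SystemExit("Python 3.8+ is required")
```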
LLM Model:
bashcopilot-6b-preview
- This model is Bash-oriented: it usually generates the right command, but at higher sampling values it may try to build scripts or long one-liners.
- Larger general-purpose models, such as Llama, may be better at understanding the intended command but worse at Bash scripting.
- Avoiding 'uncensored' models may reduce the risk of harmful commands being generated.
These settings have been tested with good results on other models as well. As a general suggestion, keep values low, especially on bigger models:
- Context Length: 200-400
- Temperature: 0.45 - 0.6
- Response Length Limit: 50-250 tokens
- Top-K Sampling: 40
- Repeat Penalty: 1.1
- Top-P Sampling: 0.95
- Minimum P Sampling: 0.05
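For reference, these settings roughly map onto a request payload for LMStudio's OpenAI-compatible API. The field names `top_k`, `repeat_penalty`, and `min_p` are assumptions that may vary by LMStudio version; check its API documentation before relying on them:

```python
# Illustrative payload using the tested settings above. Values inside the
# recommended ranges were picked arbitrarily for this example.
payload = {
    "model": "bashcopilot-6b-preview",
    "messages": [{"role": "user", "content": "Show files in current dir"}],
    "temperature": 0.5,        # tested range: 0.45 - 0.6
    "max_tokens": 150,         # tested range: 50 - 250 tokens
    "top_p": 0.95,
    "top_k": 40,               # assumed field name, LMStudio-specific
    "repeat_penalty": 1.1,     # assumed field name, LMStudio-specific
    "min_p": 0.05,             # assumed field name, LMStudio-specific
}
```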
- Clone the Repository
$ git clone https://github.com/davidegat/LAIB-Local-AI-Bash.git
$ cd LAIB-Local-AI-Bash
- Set Up a Virtual Environment (Optional)
$ python3 -m venv venv
$ source venv/bin/activate
- Install Required Libraries using the following command:
$ pip install requests tkterm
The tkinter module ships with most Python installations and is not installable via pip; on some Linux distributions you may need your package manager (e.g. python3-tk on Debian/Ubuntu). The threading, os, re, json, and queue modules are part of the Python standard library and need no installation.
- Install LMStudio
- Download and install LMStudio.
- Download and load the suggested model.
- Configure LMStudio to run at http://127.0.0.1:1234.
You can also set your favourite LMStudio endpoint via the Settings menu, to access a custom local or remote LMStudio API.
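Before starting the application, you can verify the endpoint is reachable. This sketch assumes LMStudio's OpenAI-compatible /v1/models route and is not part of LAIB# itself:

```python
import urllib.request
import urllib.error

# Returns True if the LMStudio endpoint answers; the default URL matches
# the configuration above - change it if you set a custom endpoint.
def lmstudio_reachable(url="http://127.0.0.1:1234/v1/models", timeout=3):
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if not lmstudio_reachable():
    print("LMStudio is not reachable - start it before launching LAIB#")
```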
Ensure LMStudio is running before starting the application. Then execute:
$ python laib.py
- Terminal Frame: Interactive terminal for direct Bash commands and output display.
- Query Box: Input natural language queries to generate commands.
- Menu Bar:
- Settings: Edit whitelist, blacklist, LMStudio Endpoint.
- Help: Access user guide and about section.
- Debug: shows the LMStudio console to monitor the endpoint.
- Reset Cache Button: Clears AI command cache.
- Customization and Search:
- Use the top-right menu to customize settings and search through terminal output.
- Access context menu in terminal by right-clicking.
- Enter a query in the query box, e.g.,
Show files in current dir
- AI will generate a command, which will be checked against whitelist/blacklist.
- If approved, command is executed automatically in terminal.
- If blocked, a review window allows you to edit or approve the command.
- In some cases, blocking is hardcoded (e.g., infinite loops, sudo).
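The triage flow above can be sketched as follows. This is an illustrative approximation, not LAIB#'s actual implementation; the example whitelist entries and blocked patterns are assumptions:

```python
import re

# Example whitelist and hardcoded block patterns (sudo, infinite loops,
# a fork bomb) - placeholders, not LAIB#'s real lists.
WHITELIST = {"ls", "pwd", "whoami", "date"}
HARD_BLOCKED = [r"^\s*sudo\b", r"\bwhile\s+true\b", r":\(\)\s*\{"]

def triage(command: str) -> str:
    """Decide whether a generated command runs, is blocked, or needs review."""
    if any(re.search(p, command) for p in HARD_BLOCKED):
        return "blocked"                      # hardcoded safety block
    if command.split()[0] in WHITELIST:
        return "execute"                      # whitelisted: run directly
    return "review"                           # everything else: manual review

print(triage("ls -la"))          # execute
print(triage("sudo rm -rf /"))   # blocked
print(triage("tar czf a.tgz .")) # review
```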
- Whitelist: Commands that bypass review and execute directly.
- Blacklist: Commands requiring review before execution.
- Access these lists from the Settings menu.
- Temporarily stores generated commands for faster reuse.
- Use the Reset Cache button to clear this cache.
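The cache behaviour can be sketched like this (assumed behaviour, not LAIB#'s exact code; `ask_model` is a hypothetical stand-in for the real LMStudio request):

```python
# Query-to-command cache: repeated queries skip the model round-trip.
command_cache = {}

def ask_model(query):
    return "ls -la"  # placeholder for the actual LMStudio API call

def generate(query):
    if query in command_cache:
        return command_cache[query]   # cache hit: reuse stored command
    command = ask_model(query)
    command_cache[query] = command    # store for faster reuse
    return command

def reset_cache():
    """What the Reset Cache button does: drop all stored commands."""
    command_cache.clear()
```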
- Use at your own risk: most dangerous commands are blocked, but no one can guarantee that all AI-generated commands will do no harm. Do not use in important or production contexts!
- Commands starting with sudo are not supported by the tkterm library and are blocked for safety.
- Run this software as root instead (not recommended).
- Direct input in terminal bypasses safety checks; use with caution.
- Supported Shells: currently, only Bash is supported. Other interpreters or shells selected via the terminal menu may not function correctly with this application.
- Infinite loops are not supported and are blocked before execution.
- AI not responding: ensure LMStudio is running and accessible at http://127.0.0.1:1234; check that the model is loaded and its settings are correct, or verify the LMStudio endpoint under Settings.
- Command not executing: check whether the command is in the blacklist.
- Sudo commands not allowed: These are generally blocked, run software as root if needed.
- API Console: available under the Debug menu to monitor the LMStudio endpoint.
Contributions are welcome! Please follow these steps:
- Fork the repository.
- Create a new branch for your feature/bugfix.
- Submit a pull request with a detailed explanation of changes.
This project is Open Source. See the LICENSE file for details.
Software is provided as-is. By using it, you accept full responsibility for any damage of any kind this software may cause to your data, device(s), firm, corporation, shop, family, friends, whole life, belongings, backyard, dignity, and other moral and psychological matters, your body, or your cats'.