
Add support for Llama and WizardLLM models #18

Open

wants to merge 3 commits into base: `main`
Conversation

@ticklecatisback commented Oct 18, 2024


Fixes minosvasilias#15

Add support for self-hosted generative models Llama and WizardLLM.

* **Add Llama model support:**
  - Create `addons/copilot/Llama.gd` script.
  - Implement `_get_models`, `_set_model`, `_send_user_prompt` functions.
  - Define constants and helper functions for Llama model.

* **Add WizardLLM model support:**
  - Create `addons/copilot/WizardLLM.gd` script.
  - Implement `_get_models`, `_set_model`, `_send_user_prompt` functions.
  - Define constants and helper functions for WizardLLM model.

* **Update Copilot.gd:**
  - Modify `populate_models` function to include Llama and WizardLLM models.
  - Add Llama and WizardLLM to the `models` dictionary.

* **Update CopilotUI.tscn:**
  - Add Llama and WizardLLM nodes to the `LLMs` node.
  - Set the script for Llama and WizardLLM nodes to the respective new scripts.
  - Add signal connections for Llama and WizardLLM nodes.
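The model scripts described above could take roughly the following shape. This is a minimal hypothetical sketch, not the PR's actual code: the base-class contract, the `_send_user_prompt` signature, and the Ollama-style endpoint at `http://localhost:11434` are all assumptions, and Godot 4 syntax is used throughout.

```gdscript
# Hypothetical sketch of addons/copilot/Llama.gd (names, signature,
# and endpoint are assumptions, not the PR's actual implementation).
# Assumes a local Ollama-style server at http://localhost:11434.
extends Node

const ENDPOINT := "http://localhost:11434/api/generate"

var model := "llama2"

func _get_models() -> Array:
	# Model names shown in the Model OptionButton.
	return ["llama2"]

func _set_model(new_model: String) -> void:
	model = new_model

func _send_user_prompt(user_prompt: String, user_code: String) -> void:
	var body := JSON.stringify({
		"model": model,
		"prompt": user_prompt + "\n" + user_code,
		"stream": false,
	})
	var http := HTTPRequest.new()
	add_child(http)
	http.request_completed.connect(_on_request_completed.bind(http))
	http.request(ENDPOINT, ["Content-Type: application/json"],
			HTTPClient.METHOD_POST, body)

func _on_request_completed(_result: int, _code: int,
		_headers: PackedStringArray, response_body: PackedByteArray,
		http: HTTPRequest) -> void:
	var parsed = JSON.parse_string(response_body.get_string_from_utf8())
	if parsed != null:
		print(parsed.get("response", ""))
	http.queue_free()
```

A `WizardLLM.gd` script would follow the same pattern with its own model list, and `populate_models` in `Copilot.gd` would then call each child node's `_get_models` to fill the OptionButton.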

---

For more details, open the [Copilot Workspace session](https://copilot-workspace.githubnext.com/minosvasilias/godot-copilot/issues/15?shareId=XXXX-XXXX-XXXX-XXXX).
Add `llama3.2` model option to copilot settings.

* Modify `addons/copilot/Llama.gd` to include `llama3.2` in the `_get_models` function.
* Update `addons/copilot/CopilotUI.tscn` to add `llama3.2` to the `Model` OptionButton.
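The `llama3.2` change amounts to extending the model list returned by `_get_models`. A hypothetical sketch (the surrounding model names are assumed, not taken from the PR):

```gdscript
# Hypothetical: _get_models in addons/copilot/Llama.gd after this commit,
# so "llama3.2" appears in the Model OptionButton.
func _get_models() -> Array:
	return ["llama2", "llama3.2"]
```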

---

For more details, open the [Copilot Workspace session](https://copilot-workspace.githubnext.com/ticklecatisback/godot-copilot?shareId=XXXX-XXXX-XXXX-XXXX).
Add llama3.2 model to copilot settings
Successfully merging this pull request may close these issues.

It would be cool if one could access self-hosted generative models (Llama, WizardLLM, etc.)