feature: linearly interpolating one or multiple LoRA with base model #905


Closed
jon-chuang opened this issue Apr 12, 2023 · 3 comments

jon-chuang commented Apr 12, 2023

Once #820 is merged, it would be nice to allow linearly interpolating one or multiple LoRAs with the base model.

LoRAs should be loadable interactively, and their interpolation weights should also be adjustable interactively.
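
In other words, the effective weight of a layer would be the base weight plus a weighted sum of the low-rank LoRA deltas. A minimal sketch in Python/NumPy, assuming each LoRA provides its usual (B, A) pair; the function and variable names here are illustrative only, not llama.cpp API:

```python
import numpy as np

def merge_loras(w_base, loras, weights):
    """Linearly interpolate one or more LoRA deltas into a base weight matrix.

    w_base : (out_dim, in_dim) base weight matrix
    loras  : list of (B, A) pairs, B is (out_dim, r), A is (r, in_dim)
    weights: per-LoRA interpolation weights, adjustable by the user
    """
    w = w_base.copy()
    for (B, A), alpha in zip(loras, weights):
        # Each LoRA contributes a low-rank update scaled by its interpolation weight.
        w += alpha * (B @ A)
    return w

# Example: blend two hypothetical LoRAs at 70% / 30% strength.
out_dim, in_dim, r = 8, 16, 2
w_base = np.random.randn(out_dim, in_dim).astype(np.float32)
lora_1 = (np.random.randn(out_dim, r).astype(np.float32),
          np.random.randn(r, in_dim).astype(np.float32))
lora_2 = (np.random.randn(out_dim, r).astype(np.float32),
          np.random.randn(r, in_dim).astype(np.float32))
w_merged = merge_loras(w_base, [lora_1, lora_2], weights=[0.7, 0.3])
```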


ghost commented Jun 13, 2023

@jon-chuang what currently seems to be the blocker to making LoRA adapters swappable?


yacineMTB commented Jun 13, 2023

https://github.com/ggerganov/llama.cpp/blob/master/llama.cpp#L2838-L2876

Proposal: just prototype interactive LoRA and show that it might be valuable.
Shouldn't be too hard!

Think of it as sharing a Python notebook that uses torch, but with ggml instead of torch.
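
A rough sketch of that notebook idea, assuming (as in the linked LoRA-loading code) that the low-rank delta normally gets folded into the base weights at a fixed scale: keep the untouched base weight around and re-apply the delta whenever the scale changes. Everything below is illustrative Python/NumPy, not ggml or llama.cpp API:

```python
import numpy as np

class InteractiveLora:
    """Keep the base weight around so the LoRA scale can be changed on the fly.

    Instead of baking the delta into the checkpoint once, re-derive the
    effective weight whenever the scale is adjusted (e.g. from a notebook
    cell or a slider). Names here are illustrative, not ggml / llama.cpp API.
    """
    def __init__(self, w_base, B, A):
        self.w_base = w_base          # untouched base weight
        self.delta = B @ A            # precompute the low-rank update once
        self.w = w_base.copy()        # effective weight used for inference

    def set_scale(self, scale):
        # Re-apply the delta from the untouched base rather than accumulating,
        # so the scale can be raised, lowered, or set to 0 (LoRA off) at any time.
        self.w = self.w_base + scale * self.delta

# Example: toggle a hypothetical adapter interactively.
layer = InteractiveLora(np.zeros((4, 4), dtype=np.float32),
                        np.ones((4, 1), dtype=np.float32),
                        np.ones((1, 4), dtype=np.float32))
layer.set_scale(1.0)   # LoRA fully applied
layer.set_scale(0.25)  # partially blended with the base model
layer.set_scale(0.0)   # back to the plain base weights
```

Caching the base weight costs extra memory per patched tensor, which is presumably part of the trade-off a prototype would need to demonstrate.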

github-actions bot added the stale label Mar 25, 2024

This issue was closed because it has been inactive for 14 days since being marked as stale.
