[FIXED] I wrote a script to merge lora thanks to slaren its done #1516
Comments
It's not working in my env. As I wrote in the last lines: `LlamaForCausalLM` has no attribute `merge_and_unload`.
So weird, I tried it again and it works.
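For reference, what PEFT's `merge_and_unload` does per LoRA layer is fold the low-rank update into the base weight, `W' = W + (alpha / r) * B @ A`. A minimal NumPy sketch of that arithmetic (shapes and values are made up for illustration, not taken from the script in this thread):

```python
import numpy as np

# Toy sketch of the LoRA merge arithmetic: the adapter delta
# (alpha / r) * B @ A is added into the base weight once, after
# which the adapter matrices are no longer needed at inference.
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 8, 2, 4  # hypothetical sizes

W = rng.standard_normal((d_out, d_in))  # base weight
A = rng.standard_normal((r, d_in))      # LoRA "down" projection
B = rng.standard_normal((d_out, r))     # LoRA "up" projection
scale = alpha / r

W_merged = W + scale * (B @ A)

# A plain forward pass with the merged weight matches
# base + adapter applied separately.
x = rng.standard_normal(d_in)
y_lora = W @ x + scale * (B @ (A @ x))
y_merged = W_merged @ x
print(np.allclose(y_lora, y_merged))
```

The point of merging is exactly this equivalence: after the addition, the model is a plain checkpoint with no PEFT dependency.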
Do you mind if I create a PR with it?
May I ask, how can I use the model after merging? I got
Sorry, I don't have any idea about GGUF right now... You might be able to convert it to ggml and then convert to gguf?
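For what it's worth, around the time of this thread llama.cpp's `convert.py` could produce GGUF directly from a merged Hugging Face checkpoint, so the ggml intermediate step may not be needed. A hedged command sketch (the directory and file names are hypothetical):

```shell
# Hypothetical paths; point convert.py at the merged model directory.
python convert.py ./merged-model --outtype f16 --outfile merged-f16.gguf

# Optionally quantize with the quantize binary built from llama.cpp.
./quantize merged-f16.gguf merged-q4_0.gguf q4_0
```

Flag names and the quantize invocation reflect the llama.cpp tooling of that era; check the repo's current README, since the converter scripts have been renamed and reorganized since.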
merge.py
It seems to work. It may be getting stuck at the embedding step? No clue for now. But while using convert.py:
```
  File "convert.py", line 1168, in <module>
    main()
  File "convert.py", line 1148, in main
    model_plus = load_some_model(args.model)
  File "convert.py", line 1076, in load_some_model
    model_plus = merge_multifile_models(models_plus)
  File "convert.py", line 583, in merge_multifile_models
    model = merge_sharded([mp.model for mp in models_plus])
  File "convert.py", line 562, in merge_sharded
    return {name: convert(name) for name in names}
  File "convert.py", line 562, in <dictcomp>
    return {name: convert(name) for name in names}
  File "convert.py", line 537, in convert
    lazy_tensors: List[LazyTensor] = [model[name] for model in models]
  File "convert.py", line 537, in <listcomp>
    lazy_tensors: List[LazyTensor] = [model[name] for model in models]
KeyError: 'embed_tokens.weight'
```
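Judging from the traceback, the `KeyError` likely means a tensor-name mismatch between checkpoint shards: `merge_sharded` looks up every expected name in every shard, so one shard storing the tensor under a different prefix (e.g. `model.embed_tokens.weight` vs `embed_tokens.weight`) fails exactly like this. A toy stand-in, not the real convert.py:

```python
# Simplified stand-in for convert.py's merge_sharded: it looks up each
# tensor name in every shard, so a single shard missing that name
# raises KeyError for the whole merge.
def merge_sharded(shards, names):
    merged = {}
    for name in names:
        parts = [shard[name] for shard in shards]  # KeyError if any shard lacks `name`
        merged[name] = parts
    return merged

# Hypothetical shards whose tensors carry the "model." prefix.
shard_a = {"model.embed_tokens.weight": [1, 2]}
shard_b = {"model.embed_tokens.weight": [3, 4]}

# Works when the expected name matches what the shards contain:
ok = merge_sharded([shard_a, shard_b], ["model.embed_tokens.weight"])

# Fails the way the traceback does when the name differs:
try:
    merge_sharded([shard_a, shard_b], ["embed_tokens.weight"])
except KeyError as exc:
    print("KeyError:", exc)
```

If that is the cause here, inspecting the tensor names in the merged checkpoint (e.g. the keys of its `state_dict`) against what convert.py expects would confirm it.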
Can someone try that script?