Error: Invalid model file when using converted GPT4ALL model after following provided instructions #655
I could run it with the previous version: https://github.com/ggerganov/llama.cpp/tree/master-ed3c680
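A minimal sketch of that workaround, assuming a local llama.cpp checkout and the default Makefile build (the fetch and clean-rebuild steps are assumptions, not taken from the comment):

```bash
# Hypothetical workaround: build from the older tree referenced above
cd llama.cpp
git fetch --all --tags
git checkout master-ed3c680   # the tree named in the comment above
make clean && make            # rebuild main against the older code
```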
After building from this tag, I'm getting a segfault. What OS are you using?
I solved the issue by running the migration script after executing the conversion script, and now I'm interacting with gpt4all in interactive mode.
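The exact commands in that comment were not preserved here. A minimal sketch of the likely sequence, based on the two scripts named elsewhere in this thread; the model path, tokenizer path, argument order, output filename, and main flags are all assumptions:

```bash
# Hedged reconstruction, not the commenter's exact commands.

# 1. Convert the original GPT4All checkpoint to ggml format
python3 convert-gpt4all-to-ggml.py models/gpt4all-7B/gpt4all-lora-quantized.bin ./models/tokenizer.model

# 2. Migrate the converted file to the newer ggml format from PR 613
python3 migrate-ggml-2023-03-30-pr613.py \
  models/gpt4all-7B/gpt4all-lora-quantized.bin \
  models/gpt4all-7B/gpt4all-lora-quantized-new.bin

# 3. Run main in interactive mode against the migrated file
./main -m models/gpt4all-7B/gpt4all-lora-quantized-new.bin -n 128 -i
```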
Would it be worth updating the README section with this information?
After running convert-gpt4all-to-ggml.py and migrate-ggml-2023-03-30-pr613.py, main segfaults with a failed ggml assertion.
Full logs:
These are all the steps that I did:
However, it writes nonsense and does not let me interact in interactive mode. Maybe something is wrong.
I commented out this line in ggml.c and recompiled to see what would happen, and it just worked. That was unexpected, but I won't complain.
Can anyone confirm if
This is strange. It's expected that it works after commenting this line, since we don't really need the buffer to be aligned, but I wonder why it is not the case anymore. Seems to be related to the
It happened to me when trying to use
Hello,

I have followed the instructions provided for using the GPT-4ALL model. I used the convert-gpt4all-to-ggml.py script to convert the gpt4all-lora-quantized.bin model, as instructed. However, I encountered an error related to an invalid model file when running the example.

Here are the steps I followed, as described in the instructions (see the sketch below):

1. Converted the model with the convert-gpt4all-to-ggml.py script:
2. Ran the interactive mode example with the newly generated gpt4all-lora-quantized.bin model:

However, I encountered the following error:

Please let me know how to resolve this issue and correctly convert and use the GPT-4ALL model with the interactive mode example. Thank you.
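Since the commands behind the two steps above were lost, here is a hedged sketch of what they likely looked like (paths, arguments, and flags are assumptions). Note that, per the comments earlier in this thread, the converted file also needs to be passed through migrate-ggml-2023-03-30-pr613.py before main will load it:

```bash
# Step 1 (assumed form): convert the GPT4All checkpoint to ggml
python3 convert-gpt4all-to-ggml.py models/gpt4all-7B/gpt4all-lora-quantized.bin ./models/tokenizer.model

# Step 2 (assumed form): run the interactive mode example on the converted file;
# without the PR 613 migration step, this is where the "invalid model file"
# error is reported
./main -m models/gpt4all-7B/gpt4all-lora-quantized.bin -n 128 -i
```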