Is there a requirements.txt? #8
Comments
This doesn't use Python dependencies.
@sonu27 It does need some Python for the model processing with convert-pth-to-ggml.py. I had some issues installing torch, but managed to install it using the nightly build:
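(The actual command in the original comment was not preserved in this copy of the thread. As a rough sketch, a PyTorch nightly build could be installed along these lines; the `--pre` flag and nightly CPU index URL follow PyTorch's documented nightly install pattern and are assumptions, not the commenter's exact command.)

```sh
# Install a PyTorch nightly (pre-release) build.
# The index URL is an assumption based on PyTorch's standard nightly channel.
pip install --pre torch --extra-index-url https://download.pytorch.org/whl/nightly/cpu
```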
@holstvoogd Ah, understood. According to this, you need Python 3.10 due to the lack of a torch wheel for 3.11: https://til.simonwillison.net/llms/llama-7b-m2
Ah, that explains why I had issues :)
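For reference, a minimal setup matching the advice above might look like the sketch below. It assumes Python 3.10 is already installed, and that torch, numpy, and sentencepiece are the packages convert-pth-to-ggml.py needs; the package list and the script invocation are assumptions based on the llama.cpp README of that period, not something stated in this thread.

```sh
# Create and activate a Python 3.10 virtual environment
# (3.11 lacked a torch wheel at the time, per the comments above).
python3.10 -m venv .venv
source .venv/bin/activate

# Assumed dependencies for convert-pth-to-ggml.py: torch, numpy, sentencepiece.
pip install torch numpy sentencepiece

# Convert the 7B model weights to ggml format (illustrative invocation).
python convert-pth-to-ggml.py models/7B/ 1
```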
SlyEcho pushed a commit to SlyEcho/llama.cpp that referenced this issue on Jun 1, 2023: Change how the token buffers work.
cebtenzzre added a commit to cebtenzzre/llama.cpp that referenced this issue on Nov 7, 2023.