Name and Version
./llama-cli --version
ggml_vulkan: Found 1 Vulkan devices:
ggml_vulkan: 0 = AMD Radeon Graphics (RADV RENOIR) (radv) | uma: 1 | fp16: 1 | warp size: 64 | shared memory: 65536 | matrix cores: none
version: 4882 (be7c303)
built with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for x86_64-linux-gnu
Operating systems
Linux
Which llama.cpp modules do you know to be affected?
llama-cli (Vulkan Linux x86 build)
Command line
./llama-cli -hf <any model>
common_get_hf_file: llama.cpp built without libcurl, downloading from Hugging Face not supported.
Problem description & steps to reproduce
I extracted llama-b4882-bin-ubuntu-vulkan-x64.zip and was unable to download models from Hugging Face because the binary was built without curl support. Could you please build future packages with libcurl support?
./llama-cli -hf
common_get_hf_file: llama.cpp built without libcurl, downloading from Hugging Face not supported.
Workaround:
Install llama-b4914-bin-ubuntu-x64.zip which includes libcurl support.
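Alternatively, building from source with curl enabled restores the `-hf` download path. A minimal sketch, assuming the `LLAMA_CURL` and `GGML_VULKAN` CMake options available in recent llama.cpp trees and libcurl development headers on an Ubuntu system:

```shell
# Assumption: Ubuntu/Debian host; install libcurl dev headers first
sudo apt install libcurl4-openssl-dev

# Configure a Vulkan build with curl support enabled
cmake -B build -DGGML_VULKAN=ON -DLLAMA_CURL=ON

# Build release binaries
cmake --build build --config Release -j

# The resulting llama-cli can then fetch models via -hf
./build/bin/llama-cli -hf <model>
```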