Please bear with my ignorance, as I'm largely new to working with ML models.
I'm interested in adopting DuckDB-NSQL, but I'm struggling just to get it installed. The instructions say to install the packages in `requirements.txt`, but one of them, `flash-attn`, is particularly finicky: it doesn't support installation from prebuilt wheels and has implicit build dependencies on PyTorch and CUDA. The best installation guidance I could find suggests building inside the NVIDIA container image, and using that image (or CUDA at all) seems to imply having an NVIDIA GPU. Even after I set `CUDA_HOME`, I struggled to get `flash-attn` to install (the build still complains that `CUDA_HOME` is not set).
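For reference, this is roughly the sanity check I ran in the same Python environment before retrying the build; it's illustrative rather than a reproduction script:

```python
# Illustrative check of what the flash-attn build sees in my environment.
import os
import torch

print("CUDA_HOME:", os.environ.get("CUDA_HOME"))                 # the build expects this to point at a CUDA toolkit
print("torch.version.cuda:", torch.version.cuda)                 # None if the installed torch wheel is CPU-only
print("torch.cuda.is_available():", torch.cuda.is_available())   # False on machines without an NVIDIA GPU
```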
Does that mean that DuckDB-NSQL depends on NVIDIA GPUs for both training and inference? Is it possible to run inference on CPU? Would you consider providing step-by-step instructions for running DuckDB-NSQL inference on generic hardware, or at least pointing me to some references I could follow to build up the necessary foundations?
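For what it's worth, the kind of CPU-only inference I was hoping to run is sketched below. The checkpoint id is my guess from the project's Hugging Face page, and I haven't verified that the model loads without `flash-attn` installed, so please treat this as an assumption rather than something that currently works:

```python
# Hypothetical CPU-only inference sketch (unverified): assumes the checkpoint
# "motherduckdb/DuckDB-NSQL-7B-v0.1" and that it loads without flash-attn.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "motherduckdb/DuckDB-NSQL-7B-v0.1"  # assumed name; please correct me if wrong

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float32)  # stays on CPU by default

prompt = (
    "CREATE TABLE taxi (fare DOUBLE, tip DOUBLE);\n\n"
    "-- What is the average tip as a fraction of fare?\n"
    "SELECT"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If that's the wrong approach entirely, even a pointer to the intended CPU path would help.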