🚀 Feature

It would ease model usage if the weights of large (>1B parameter) models were available in .safetensors format.
Motivation
As of now, it takes ~1 hour to download the unbabel/*-xxl models on an A100. This could be resolved by distributing the model weights in another format.
The safetensors format loads faster than the PyTorch one, as stated here: https://huggingface.co/docs/safetensors/en/speed.
Additional context
It seems that safetensors support would require:
- converting the weights of the published models (e.g. with save_model from safetensors.torch)
- support within the classes, for example a new function in CometModel for loading safetensors, similar to load_from_checkpoint