Safetensors Support #212

Open
upunaprosk opened this issue Apr 13, 2024 · 0 comments
Labels
enhancement New feature or request

Comments

@upunaprosk

🚀 Feature

It would ease model usage if the weights of large (>1B parameter) models were also available in the .safetensors format.

Motivation

Currently, it takes ~1 hour to download the unbabel/*-xxl models onto an A100 machine. This could be mitigated by distributing the model weights in another format.
The safetensors format loads faster than PyTorch's native one, as benchmarked here: https://huggingface.co/docs/safetensors/en/speed.

Additional context

It seems that safetensors support would require:

  • converting the weights of the published models (e.g. with save_model from safetensors.torch)
  • support within the classes, for example a new function for loading safetensors in the CometModel class
  • another function similar to load_from_checkpoint
@upunaprosk upunaprosk added the enhancement New feature or request label Apr 13, 2024