Quantization of the 72B model currently crashes at this call; how can I fix it?

```python
from optimum.intel import OVModelForCausalLM, OVWeightQuantizationConfig

ov_model = OVModelForCausalLM.from_pretrained(
    model_path,
    export=True,
    compile=False,
    quantization_config=OVWeightQuantizationConfig(bits=4, **compression_configs),
)
```
How much system memory does your machine have? Quantizing a 72B model may require more than 256 GB of RAM.
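As a rough sanity check, the peak RAM needed can be estimated from the parameter count. This is a sketch for illustration only: the multipliers below are assumptions, not figures from the optimum-intel documentation, and the helper name is hypothetical.

```python
def quantization_ram_estimate_gb(n_params_billions: float) -> float:
    """Rough peak-RAM estimate (in GB) for 4-bit weight quantization of a
    model exported from fp16 weights. Multipliers are assumptions."""
    fp16_weights = n_params_billions * 2.0   # 2 bytes per parameter in fp16
    int4_output = n_params_billions * 0.5    # 0.5 bytes per parameter in int4
    conversion_overhead = fp16_weights       # temporary buffers during export
    return fp16_weights + int4_output + conversion_overhead

print(quantization_ram_estimate_gb(72))  # 324.0 -> consistent with ">256 GB"
```

Under these assumptions a 72B model peaks at roughly 324 GB, which matches the suggestion above that 256 GB is not enough.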