
Does this support the 72B instruct model? #4

Open
mxzgn opened this issue Jun 12, 2024 · 1 comment

Comments


mxzgn commented Jun 12, 2024

The 72B model currently aborts at:

from optimum.intel import OVModelForCausalLM, OVWeightQuantizationConfig

ov_model = OVModelForCausalLM.from_pretrained(
    model_path, export=True, compile=False,
    quantization_config=OVWeightQuantizationConfig(bits=4, **compression_configs))

How can this be resolved?

@openvino-dev-samples (Owner)

How much system memory does your machine have? Quantizing a 72B model may require more than 256 GB of RAM.
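
As a rough illustration (not part of the original thread), one way to avoid an abrupt abort is to check available RAM before starting the export. The use of psutil, the 256 GB threshold, and the model_path value below are all assumptions for illustration based on the maintainer's reply, not the project's documented workflow:

```python
# Hedged sketch: pre-flight memory check before 4-bit weight quantization.
# psutil, the 256 GB threshold, and model_path are illustrative assumptions.
import psutil
from optimum.intel import OVModelForCausalLM, OVWeightQuantizationConfig

model_path = "Qwen/Qwen2-72B-Instruct"  # hypothetical placeholder checkpoint
required_gb = 256                       # lower bound suggested in the reply above

available_gb = psutil.virtual_memory().available / 1024 ** 3
if available_gb < required_gb:
    raise MemoryError(
        f"Only {available_gb:.0f} GB of RAM available; quantizing a 72B model "
        f"may need more than {required_gb} GB."
    )

# Export and 4-bit weight quantization, as in the snippet from the question.
ov_model = OVModelForCausalLM.from_pretrained(
    model_path,
    export=True,
    compile=False,
    quantization_config=OVWeightQuantizationConfig(bits=4),
)
```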
