
[Bug]: AttributeError: 'Qwen2Model' object has no attribute 'rotary_emb' #10773

Open
Alex-DeepL opened this issue Nov 29, 2024 · 5 comments
Labels
bug Something isn't working

Comments

@Alex-DeepL

Your current environment

The output of `python collect_env.py` was not provided (the issue-template placeholder was left unfilled).

Model Input Dumps

```python
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_path = '/home/root123/workspace/model/qwen2-0-5/'
quant_path = '/home/root123/workspace/model/qwen2-0-5-awq-4/'
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}

# Load model
model = AutoAWQForCausalLM.from_pretrained(
    model_path, **{"low_cpu_mem_usage": True, "use_cache": False}
)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# Quantize
model.quantize(tokenizer, quant_config=quant_config)

# Save quantized model
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)

print(f'Model is quantized at "{quant_path}"')
```

🐛 Describe the bug

An error is raised when quantizing the model with AWQ: `AttributeError: 'Qwen2Model' object has no attribute 'rotary_emb'`.

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
@Alex-DeepL added the bug (Something isn't working) label Nov 29, 2024
@DaBossCoda

I'm getting the same thing, but I think this is a bug in AutoAWQ.

@Alex-DeepL
Author

Alex-DeepL commented Dec 2, 2024 via email

@ArlanCooper

+1

@casper-hansen
Contributor

casper-hansen commented Dec 5, 2024

This is a bug in the weight-saving method provided by Hugging Face. It is being tracked in Transformers (linked below) and will be fixed in the future. Until then, I recommend using v0.2.6 of AutoAWQ, or patching your model as described here: casper-hansen/AutoAWQ#665 (comment)

huggingface/transformers#35080
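For readers hitting the same error, the version pin recommended above can be applied with pip (a minimal sketch, assuming a pip-managed environment; `0.2.6` is the AutoAWQ release named in the comment):

```shell
# Downgrade AutoAWQ to v0.2.6, the release recommended above as
# unaffected by the Hugging Face weight-saving bug.
pip install "autoawq==0.2.6"
```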

@casper-hansen
Contributor

This issue can now be closed, as it has been resolved upstream in huggingface_hub and now in AutoAWQ.
