
AttributeError: 'LlamaRotaryEmbedding' object has no attribute 'cos_cached' #168

Closed
Ronnie-Leon76 opened this issue Feb 12, 2024 · 12 comments
Labels
fixed - pending confirmation Fixed, waiting for confirmation from poster

Comments

@Ronnie-Leon76
[screenshot of the AttributeError traceback]
I'm trying to fine-tune unsloth/yi-6b-bnb-4bit on a custom dataset, but as soon as I start training with `trainer.train` (using a CosineAnnealing learning-rate scheduler), I get the error shown in the screenshot: "AttributeError: 'LlamaRotaryEmbedding' object has no attribute 'cos_cached'". The error seems to come from the LLaMA model in the Transformers library. It appears the issue is in the rotary-embedding implementation, specifically the `rotate_half` function, where the slicing should have been interleaved. I'd appreciate help solving this.
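For context, HF transformers' Llama implementation uses a non-interleaved half-split in `rotate_half` (the checkpoint-conversion script permutes the weights so this is equivalent to the interleaved form). A minimal numpy sketch of that half-split rotation, for illustration only (the real implementation operates on torch tensors):

```python
import numpy as np

def rotate_half(x):
    # Split the last dimension in half and rotate: (x1, x2) -> (-x2, x1).
    # This mirrors the non-interleaved ("GPT-NeoX style") form used by
    # HF's Llama rotary embedding.
    half = x.shape[-1] // 2
    x1, x2 = x[..., :half], x[..., half:]
    return np.concatenate((-x2, x1), axis=-1)

print(rotate_half(np.array([1.0, 2.0, 3.0, 4.0])))  # [-3. -4.  1.  2.]
```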

@danielhanchen
Contributor

@Ronnie-Leon76 I'll check this out today!! Sorry about the issue!

@Ronnie-Leon76
Author

I'd really appreciate it.

@hbernie

hbernie commented Feb 14, 2024

I'm having the same issue with unsloth/llama-2-7b-chat-bnb-4bit.

@danielhanchen
Contributor

@Ronnie-Leon76 @hbernie Apologies, I couldn't look into it yesterday - I was a bit inundated :(( Will 100% do it today! :) Sorry again!

@Ronnie-Leon76
Author

Ronnie-Leon76 commented Feb 14, 2024

It's okay. Is it something I can help with? Could you give me a high-level breakdown of what needs to be done to fix the issue? Should we refine the `rotate_half()` method?

@danielhanchen
Contributor

@Ronnie-Leon76 @hbernie
I found the issue! A push was made a few days ago to the latest HuggingFace transformers branch (huggingface/transformers#27931) which broke it!!

I'm assuming you used the HF notebooks we shared - for now I would comment out `!pip install "git+https://github.com/huggingface/transformers.git"` - we install the correct version anyway.

Also, if you're on a local PC, I would downgrade transformers via `pip install transformers==4.37.2`.
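If you want to catch this before training starts, a small guard can check the installed transformers version against the pin suggested above. The 4.38 boundary is an assumption inferred from the 4.37.2 pin in this thread, not a documented cutoff:

```python
import importlib.metadata

def needs_pin(version: str) -> bool:
    # Assumption from this thread: transformers releases after 4.37.x
    # changed the rotary-embedding internals that older Unsloth relied on.
    major, minor = (int(p) for p in version.split(".")[:2])
    return (major, minor) >= (4, 38)

try:
    installed = importlib.metadata.version("transformers")
    if needs_pin(installed):
        print(f"transformers {installed} may lack cos_cached; "
              "consider pinning transformers==4.37.2 or upgrading Unsloth")
except importlib.metadata.PackageNotFoundError:
    pass  # transformers not installed in this environment
```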

I will for now edit my notebooks to remove the lines - hope the temporary fix solves it!

@danielhanchen
Contributor

@Ronnie-Leon76 @hbernie I think I fixed it!! Hope you all can try it out :) I also updated all the notebooks on our HuggingFace branch https://huggingface.co/datasets/unsloth/notebooks/tree/main and on our blog posts.

No need to change your notebooks!
If you're using a local machine (on Colab there's no need), upgrade Unsloth via `pip install --upgrade --force-reinstall --no-cache-dir git+https://github.com/unslothai/unsloth.git`

Hope it works!

@danielhanchen danielhanchen added the fixed - pending confirmation Fixed, waiting for confirmation from poster label Feb 14, 2024
@Ronnie-Leon76
Author

@danielhanchen It works fine. Thanks a lot.

@spydaz

spydaz commented Jul 18, 2024

The error came back:

AttributeError: 'MistralRotaryEmbedding' object has no attribute 'cos_cached'
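One generic way this kind of breakage gets papered over (an illustrative sketch only - not Unsloth's actual fix, and the names here are hypothetical) is a defensive accessor that tolerates both the old API, where the module carries `cos_cached`/`sin_cached` buffers, and the newer one, where the module's forward computes and returns `(cos, sin)`:

```python
def get_cos_sin(rotary_emb, x, position_ids):
    # Old transformers: cached buffers live on the module.
    cos = getattr(rotary_emb, "cos_cached", None)
    sin = getattr(rotary_emb, "sin_cached", None)
    if cos is None or sin is None:
        # Newer transformers: calling the module returns (cos, sin).
        cos, sin = rotary_emb(x, position_ids)
    return cos, sin
```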

@danielhanchen
Contributor

@spydaz Best to uninstall and then reinstall Unsloth.

@SnehaKumari14

I am facing the same issue, can someone help? I tried uninstalling unsloth and then reinstalling it, but I still get the error.

@danielhanchen
Contributor

@SnehaKumari14 So

```shell
pip uninstall unsloth -y
pip install --upgrade --no-cache-dir "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
```

does not work? :(
