
Running LoRA on the llama-2-13b model with a single GPU raises an error #95

Open
Enoch202 opened this issue Oct 24, 2023 · 1 comment

Comments

@Enoch202

Running on a single GPU fails with "NotImplementedError: Cannot copy out of meta tensor; no data!"
Running on two GPUs works, but GPU memory usage is very high: with two 48GB A6000s, each card uses about 35GB.
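For what it's worth, this error usually appears when part of the model is left on the meta device (e.g. `device_map="auto"` offloading when the full fp16 13B weights don't fit on one card) and something later calls `.to()` / `.cuda()` on those empty tensors. Below is a minimal single-GPU sketch that avoids the offload by loading in 4-bit before attaching LoRA, assuming transformers + peft + bitsandbytes; the model path, target modules and LoRA hyperparameters are placeholders, not this repo's actual training script.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_name = "meta-llama/Llama-2-13b-hf"  # placeholder path

# Quantize to 4-bit so the 13B model fits on a single 48GB card
# without CPU/meta offload.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_quant_type="nf4",
)

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=bnb_config,
    device_map={"": 0},  # pin all weights to GPU 0; no meta/CPU offload
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Prepare the quantized model, then attach LoRA adapters.
model = prepare_model_for_kbit_training(model)
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # placeholder modules
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```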

@Enoch202 changed the title from "Running LoRA on the llama-13b model with a single GPU raises an error" to "Running LoRA on the llama-2-13b model with a single GPU raises an error" Oct 24, 2023
@jianzhnie
Owner

Could you try Microsoft's DeepSpeed?
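If you go the DeepSpeed route, here is a minimal sketch of a ZeRO-2 config passed through the Hugging Face Trainer, assuming your training script uses `transformers.TrainingArguments`; the values are illustrative defaults, not this repo's actual config.

```python
from transformers import TrainingArguments

# Illustrative ZeRO-2 config: shard optimizer state and gradients across GPUs
# and offload optimizer state to CPU to reduce per-GPU memory.
ds_config = {
    "zero_optimization": {
        "stage": 2,
        "offload_optimizer": {"device": "cpu"},
    },
    "bf16": {"enabled": True},
    "gradient_accumulation_steps": "auto",
    "train_micro_batch_size_per_gpu": "auto",
}

training_args = TrainingArguments(
    output_dir="output",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=16,
    bf16=True,
    deepspeed=ds_config,  # accepts a dict or a path to a JSON config file
)
```

Then launch with the DeepSpeed launcher, e.g. `deepspeed --num_gpus=2 train.py ...`, where `train.py` stands in for whatever entry script you use.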
