Issues: InternLM/xtuner
command error: "Adafactor is already registered in optimizer at torch.optim"
#957 · opened Oct 28, 2024 by monteir03
Error when fine-tuning a model based on InternLM2-7B: TypeError: Linear4bit.forward() takes 2 positional arguments but 3 were given
#941 · opened Oct 5, 2024 by AFObject
When seq_parallel_world_size is set to a value greater than 1, should use_varlen_attn not be set to True? (see the config sketch after this list)
#938 · opened Sep 27, 2024 by Fovercon
AttributeError: 'Qwen2FlashAttention2' object has no attribute '_flash_attention_forward'
#935 · opened Sep 24, 2024 by zhangyuqi-1
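
Issue #938 above asks how two training options interact. For reference, here is a minimal sketch of how both settings appear together in an xtuner-style Python config; the checkpoint path and the parallel size of 2 are illustrative assumptions, not values taken from the issue.

    from transformers import AutoModelForCausalLM
    from xtuner.model import SupervisedFinetune

    # Sequence parallelism degree: values > 1 split each training
    # sequence across GPUs (the situation the issue asks about).
    sequence_parallel_size = 2  # assumption: illustrative value

    # Variable-length (packed) attention. The issue asks whether this
    # must stay False once sequence parallelism is enabled.
    use_varlen_attn = False

    model = dict(
        type=SupervisedFinetune,
        use_varlen_attn=use_varlen_attn,
        llm=dict(
            type=AutoModelForCausalLM.from_pretrained,
            # assumption: any causal LM checkpoint works here
            pretrained_model_name_or_path='internlm/internlm2-7b',
            trust_remote_code=True))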