
BUG: Use Cache class instead of raw tuple for transformers continuous batching, compatible with latest transformers #2820

Conversation

ChengjieLi28
Contributor

@ChengjieLi28 ChengjieLi28 commented Feb 8, 2025

Fixes #2818
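For context: recent transformers releases expect `past_key_values` to be a `Cache` object (e.g. `DynamicCache`) that exposes methods such as `get_seq_length()`, rather than the legacy tuple-of-tuples, so passing a raw tuple triggers the error reported in #2818. The sketch below is a minimal, self-contained illustration of that difference; `MiniDynamicCache` is a hypothetical stand-in written for this example, not the real transformers class, so the snippet runs without transformers or torch installed:

```python
# Hypothetical stand-in for transformers' DynamicCache, illustrating why
# code that calls get_seq_length() breaks when handed a raw tuple.

class MiniDynamicCache:
    """Stores per-layer (key, value) caches and reports sequence length."""

    def __init__(self):
        self.key_cache = []    # one entry per layer
        self.value_cache = []

    @classmethod
    def from_legacy_cache(cls, past_key_values):
        """Convert the legacy tuple-of-tuples format into a cache object."""
        cache = cls()
        for key, value in past_key_values:
            cache.key_cache.append(key)
            cache.value_cache.append(value)
        return cache

    def get_seq_length(self, layer_idx=0):
        """Number of cached tokens; a raw tuple has no such method."""
        if len(self.key_cache) <= layer_idx:
            return 0
        # In this sketch each layer's keys are a plain list with one
        # entry per cached token (real caches hold tensors).
        return len(self.key_cache[layer_idx])


# Legacy format: tuple of (key, value) pairs, one pair per layer.
legacy = ((["k1", "k2", "k3"], ["v1", "v2", "v3"]),)  # 1 layer, 3 tokens

cache = MiniDynamicCache.from_legacy_cache(legacy)
print(cache.get_seq_length())  # 3
# legacy.get_seq_length() would raise AttributeError: tuples lack it.
```

Wrapping the raw tuple in a cache object before handing it back to the model is the shape of fix this PR applies, adapted to the batching code's own cache handling.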

@XprobeBot XprobeBot added the bug Something isn't working label Feb 8, 2025
@XprobeBot XprobeBot added this to the v1.x milestone Feb 8, 2025
@ChengjieLi28 ChengjieLi28 changed the title to BUG: Use Cache class instead of raw tuple for transformers continuous batching, compatible with latest transformers Feb 8, 2025
Contributor

@qinxuye qinxuye left a comment


LGTM

@ChengjieLi28 ChengjieLi28 merged commit ac97a13 into xorbitsai:main Feb 8, 2025
11 of 13 checks passed
Labels
bug Something isn't working
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Running deepseek-r1-distill-qwen raises a get_seq_length exception
3 participants