
Fix the type of paged_kv_cache in append #597

Merged — 1 commit merged into flashinfer-ai:main on Nov 8, 2024

Conversation

@nandor (Contributor) commented Nov 8, 2024

The type is adjusted to be consistent with the prefill/decode wrappers.
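For context, a hedged sketch of the kind of annotation fix the PR describes. The names and types below are hypothetical, not flashinfer's actual signatures: the idea is that a paged KV cache argument may be either one combined buffer or a separate (key, value) pair, so the append path should carry the same union type annotation as the prefill/decode wrappers.

```python
from typing import List, Tuple, Union

# Hypothetical alias illustrating the consistency fix: the paged KV cache
# may be a single combined buffer or a (key, value) pair, so every wrapper
# (prefill, decode, append) should accept the same Union type.
KvCache = Union[List[float], Tuple[List[float], List[float]]]

def append_to_cache(paged_kv_cache: KvCache, new_kv: List[float]) -> KvCache:
    """Append new entries, accepting both cache layouts (illustrative only)."""
    if isinstance(paged_kv_cache, tuple):
        # Separate K/V layout: split the incoming entries between the halves.
        k, v = paged_kv_cache
        half = len(new_kv) // 2
        return (k + new_kv[:half], v + new_kv[half:])
    # Combined layout: append directly.
    return paged_kv_cache + new_kv
```

With a shared alias like this, a type checker flags any wrapper whose annotation drifts from the others, which is the consistency the PR restores.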

@zhyncs zhyncs merged commit f5621d3 into flashinfer-ai:main Nov 8, 2024
yzh119 added a commit that referenced this pull request Nov 8, 2024

3 participants