Issues: ROCm/flash-attention (forked from Dao-AILab/flash-attention)
#120 [Issue]: Test failing with ROCm 6.3.1 on MI250X (Under Investigation; opened Jan 29, 2025 by al-rigazzi)
#111 [Issue]: GFX1151 fails to build. Undeclared identifier 'CK_TILE_BUFFER_RESOURCE_3RD_DWORD' (Under Investigation; opened Dec 17, 2024 by kiram9)
#79 [Issue]: is scaled_dot_product_attention part of flash attention? (opened Sep 2, 2024 by unclemusclez)
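On #79: PyTorch's torch.nn.functional.scaled_dot_product_attention is a dispatcher that can route to a FlashAttention kernel among other backends, so it is related to, but distinct from, this library. A minimal sketch, assuming PyTorch ≥ 2.3, that forces the FlashAttention backend:

```python
import torch
import torch.nn.functional as F
from torch.nn.attention import SDPBackend, sdpa_kernel

# Typical SDPA layout: (batch, heads, seqlen, headdim), half precision on GPU.
q = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)
k, v = torch.randn_like(q), torch.randn_like(q)

# Restrict dispatch to the FlashAttention backend; this errors out if the
# inputs or the installed build do not support that kernel.
with sdpa_kernel(SDPBackend.FLASH_ATTENTION):
    out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
```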
#71 [Feature]: Flash Attention 3 Support for MI300X GPUs (Feature Request, Under Investigation; opened Aug 1, 2024 by codinggosu)
#53 [Feature]: Support for newer flash-attention versions (e.g. ≥2.1.0) (Feature Request, Under Investigation; opened May 22, 2024 by JiahuaZhao)
#40 [Issue]: Expected dout_seq_stride == out_seq_stride to be true, but got false (opened Jan 30, 2024 by ehartford)
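On #40: the error message suggests the backward pass checks that the incoming gradient shares the forward output's sequence stride. A hypothetical repro of that mismatch in plain PyTorch (the shapes and the `.contiguous()` workaround are assumptions, not taken from the issue):

```python
import torch

# (batch, seqlen, heads, headdim), the layout flash-attention expects.
out = torch.randn(2, 128, 8, 64)
# Same shape, but built via a transpose, so its strides differ.
dout = torch.randn(2, 8, 128, 64).transpose(1, 2)

print(out.stride(1), dout.stride(1))  # 512 vs 64: sequence strides differ
dout = dout.contiguous()              # common workaround: match out's layout
print(out.stride(1), dout.stride(1))  # 512 vs 512: strides now agree
```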
#35 Merge to upstream flash-attention repo (Under Investigation, upstream; opened Jan 18, 2024 by ehartford)
#33 replace kernel implementation using CK tile-programming performant kernels (opened Jan 10, 2024 by carlushuang; 4 tasks)
#22 Feature request: Sliding Window Attention (Feature Request, function; opened Nov 29, 2023 by tjtanaa)
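On #22: upstream flash-attention (Dao-AILab, ≥ 2.3) exposes sliding windows through the window_size argument of flash_attn_func; a minimal sketch of that upstream API, assuming the same signature were available in this fork:

```python
import torch
from flash_attn import flash_attn_func

# (batch, seqlen, heads, headdim) in half precision on GPU.
q = torch.randn(1, 1024, 8, 64, device="cuda", dtype=torch.float16)
k, v = torch.randn_like(q), torch.randn_like(q)

# window_size=(left, right): each query attends to at most 256 earlier
# tokens and none ahead; (-1, -1) means an unrestricted window.
out = flash_attn_func(q, k, v, causal=True, window_size=(256, 0))
```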