Releases: bdashore3/flash-attention
v2.4.1
Add Windows workflows
2.3.3-windows
In parity with the original tag
Built with PyTorch 2.1.1 and CUDA 12.2. This wheel will work with PyTorch 2.1+ and CUDA 12+.
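Before downloading, it helps to confirm your environment matches what the wheel was built against. A minimal sketch of that version check (the helper names are hypothetical, not part of this repo; the thresholds come from the compatibility note above):

```python
def parse_version(v: str) -> tuple:
    # "2.1.1" -> (2, 1, 1); drop any local suffix like "+cu122"
    return tuple(int(p) for p in v.split("+")[0].split("."))

def wheel_compatible(torch_version: str, cuda_version: str) -> bool:
    # This release targets PyTorch 2.1+ and CUDA 12+ (per the notes above)
    return (parse_version(torch_version) >= (2, 1)
            and parse_version(cuda_version) >= (12,))

# In practice you would feed in torch.__version__ and torch.version.cuda.
print(wheel_compatible("2.1.1", "12.2"))  # True
print(wheel_compatible("2.0.1", "12.1"))  # False (PyTorch too old)
```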
Full Changelog: https://github.com/bdashore3/flash-attention/commits/2.3.3
2.3.2-windows
CUDA 12.1 only. Please see the original repo for more information.
2.3.2-2-windows
Updated wheels to CUDA 12.2 and 12.1 versions. The 12.2 wheel is backwards compatible with 12.1 (tested on my 3090 Ti system).
2.3.2-1-windows
Tests a "unified" wheel.