
Releases: bdashore3/flash-attention

v2.4.1

25 Dec 06:26
Add Windows workflows

2.3.3-windows

18 Nov 23:37

In parity with the original tag

Built with PyTorch 2.1.1 and CUDA 12.2. This wheel works with PyTorch 2.1+ and CUDA 12+.

Full Changelog: https://github.com/bdashore3/flash-attention/commits/2.3.3
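Before downloading a wheel, it can help to confirm the local environment meets the compatibility floor stated above (PyTorch 2.1+ and CUDA 12+). A minimal sketch of such a check; `meets_wheel_requirements` is a hypothetical helper, not part of flash-attention:

```python
def meets_wheel_requirements(torch_version: str, cuda_version: str) -> bool:
    """Check the compatibility floor stated in the 2.3.3-windows release
    notes: PyTorch 2.1+ and CUDA 12+. (Hypothetical helper, for illustration.)"""
    def parse(version: str) -> tuple:
        # Keep only leading numeric components, e.g. "2.1.1+cu121" -> (2, 1, 1)
        parts = []
        for piece in version.split("+")[0].split("."):
            if not piece.isdigit():
                break
            parts.append(int(piece))
        return tuple(parts)

    return parse(torch_version) >= (2, 1) and parse(cuda_version) >= (12,)


# Example: compare torch.__version__ and torch.version.cuda against the floor.
print(meets_wheel_requirements("2.1.1", "12.2"))   # True: matches the build
print(meets_wheel_requirements("2.0.1", "12.1"))   # False: PyTorch too old
print(meets_wheel_requirements("2.1.0", "11.8"))   # False: CUDA too old
```

In a real environment the two arguments would come from `torch.__version__` and `torch.version.cuda`; tuple comparison handles versions of differing length (e.g. `(12, 2) >= (12,)`).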

2.3.2-windows

09 Oct 02:55

CUDA 12.1 only. Please see the original repo for more information.

2.3.2-2-windows

14 Oct 16:26
Pre-release

Updated wheels to CUDA 12.2 and 12.1 versions. The 12.2 wheel is backwards compatible with 12.1 (tested on my 3090 Ti system).

2.3.2-1-windows

14 Oct 03:41
Pre-release

Tests the "unified" wheel.