Releases: bdashore3/flash-attention
v2.5.9.post1
Actions: clarify dispatch formatting (Signed-off-by: kingbri <bdashore3@proton.me>)
v2.5.8
Same as the upstream tag
Now built only for PyTorch 2.2.2 and 2.3.0
v2.5.6
Same as the upstream tag
v2.5.2
Same as the upstream tag
Adds this PR to help fix building on Windows
v2.4.2
In line with the parent repo's tag
Built for CUDA 12.x and PyTorch 2.1.2 and 2.2
v2.4.3 and up cannot be built on Windows at this time.
v2.4.1
Add Windows workflows
2.3.3-windows
In parity with the original tag
Built with PyTorch 2.1.1 and CUDA 12.2. This wheel will work with PyTorch 2.1+ and CUDA 12+
Full Changelog: https://github.com/bdashore3/flash-attention/commits/2.3.3
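The "PyTorch 2.1+ and CUDA 12+" claim above can be sanity-checked before installing a wheel. A minimal sketch, assuming the version strings come from `torch.__version__` and `torch.version.cuda` (the helper name and minimums are illustrative, not part of this repo):

```python
def wheel_compatible(torch_version: str, cuda_version: str,
                     min_torch=(2, 1), min_cuda=(12, 0)) -> bool:
    """Return True if the environment meets the wheel's stated minimums.

    torch_version may carry a local tag like "2.2.0+cu121"; only the
    major.minor parts are compared.
    """
    tv = tuple(int(p) for p in torch_version.split("+")[0].split(".")[:2])
    cv = tuple(int(p) for p in cuda_version.split(".")[:2])
    return tv >= min_torch and cv >= min_cuda

# Matches the note: built with PyTorch 2.1.1 / CUDA 12.2
print(wheel_compatible("2.1.1", "12.2"))   # True
print(wheel_compatible("2.0.1", "11.8"))   # False
```

In a live environment you would pass `torch.__version__` and `torch.version.cuda` directly instead of literals.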
2.3.2-windows
CUDA 12.1 only. Please see the original repo for more information
2.3.2-2-windows
Updated wheels to CUDA 12.2 and 12.1 versions. The 12.2 wheel is backwards compatible with 12.1 (tested on my 3090 Ti system).
2.3.2-1-windows
Tests "unified" wheel