Releases: Dao-AILab/flash-attention

v2.4.3.post1 (22 Jan 01:24)
[CI] Fix CUDA 12.2.2 compilation

v2.4.3 (22 Jan 01:15)
Bump to v2.4.3

v2.4.2 (26 Dec 00:29)
Bump to v2.4.2

v2.4.1 (24 Dec 05:01)
Bump to v2.4.1

v2.4.0.post1 (22 Dec 18:10)
[CI] Don't compile for Python 3.7 / PyTorch 2.2

v2.4.0 (22 Dec 08:10)
Bump to v2.4.0

v2.3.6 (28 Nov 00:24)
Bump to v2.3.6

v2.3.5 (27 Nov 03:09)
Bump to v2.3.5

v2.3.4 (20 Nov 07:22)
Bump to v2.3.4

v2.3.3 (24 Oct 07:24)
Bump to v2.3.3
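
Each tag above is also published to PyPI under the matching version, so a project can pin to a specific release (for example, pip install flash-attn==2.4.3.post1). For orientation, here is a minimal usage sketch against the library's public flash_attn_func entry point; the sizes are illustrative, and the sketch assumes a CUDA GPU with fp16 or bf16 inputs, which the library requires.

```python
# Minimal sketch of calling FlashAttention-2 from a pinned release,
# e.g. after `pip install flash-attn==2.4.3.post1`.
# Assumes a CUDA GPU; inputs must be fp16 or bf16, laid out as
# (batch, seqlen, nheads, headdim).
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64  # illustrative sizes
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
v = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")

# causal=True applies a lower-triangular mask (decoder-style attention).
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # torch.Size([2, 1024, 8, 64])
```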