Releases: Dao-AILab/flash-attention

v2.5.8

v2.5.7

08 Apr 03:15
Bump to v2.5.7

v2.5.6

02 Mar 06:17
Bump to v2.5.6

v2.5.5

21 Feb 23:59
Bump to v2.5.5

v2.5.4

21 Feb 00:32
Bump to v2.5.4

v2.5.3

10 Feb 09:09
Bump to v2.5.3

v2.5.2

31 Jan 10:46
Bump to v2.5.2

v2.5.1.post1

30 Jan 22:34
[CI] Install torch 2.3 using index

v2.5.1

30 Jan 05:07
Bump to v2.5.1

v2.5.0

23 Jan 07:41
Bump to v2.5.0