Releases: Dao-AILab/flash-attention

v2.6.3

v2.6.2 (23 Jul 09:30)
Bump to v2.6.2

v2.6.1 (11 Jul 15:29)
Bump to v2.6.1

v2.6.0.post1 (11 Jul 09:55)
[CI] Compile with pytorch 2.4.0.dev20240514

v2.6.0 (11 Jul 04:35)
Bump v2.6.0

v2.5.9.post1 (26 May 22:36)
Limit to MAX_JOBS=1 with CUDA 12.2
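
This cap addresses parallel nvcc 12.2 compilation jobs exhausting RAM during the CUDA extension build. For a from-source install, the flash-attention README documents applying the same limit manually via the MAX_JOBS environment variable, which PyTorch's extension builder (ninja) reads:

    MAX_JOBS=1 pip install flash-attn --no-build-isolation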

v2.5.9 (26 May 21:02)
Bump to 2.5.9

v2.5.8 (26 Apr 17:55)
Bump to v2.5.8

v2.5.7 (08 Apr 03:15)
Bump to v2.5.7

v2.5.6 (02 Mar 06:17)
Bump to v2.5.6