
Releases: lucidrains/alphafold2

0.0.21

15 Feb 23:41
add setting to row-tie attention for MSA, but with a strict warning that the batches of MSA contain no padding or masked-out tokens
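The row-tying mentioned here can be sketched as follows. This is a minimal NumPy illustration, not the library's implementation: attention logits are summed across all MSA rows so every row shares one column-to-column attention map, which is also why padded or masked tokens would silently corrupt the shared map. The function name and scaling convention are my own.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def row_tied_attention(msa):
    """Tied row attention over an MSA of shape (rows, cols, dim).

    Logits are summed across rows, so all rows share a single
    attention map. Any padding or masked-out tokens in the batch
    would leak into this shared map -- hence the strict warning.
    """
    r, c, d = msa.shape
    scale = 1.0 / np.sqrt(r * d)  # illustrative scaling choice
    logits = np.einsum('rid,rjd->rij', msa, msa) * scale  # (r, c, c)
    tied = softmax(logits.sum(axis=0), axis=-1)           # (c, c), shared
    return np.einsum('ij,rjd->rid', tied, msa)            # (r, c, d)

msa = np.random.randn(4, 8, 16)
out = row_tied_attention(msa)
print(out.shape)  # (4, 8, 16)
```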

0.0.20

15 Feb 22:49
use axial attention for MSA processing, in line with the MSA Transformer paper released two days ago by @rmrao
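A rough sketch of what axial attention over an MSA means, under my own simplifying assumptions (single head, no projections): attend along each row (across columns), then along each column (across rows), instead of full attention over the flattened alignment.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attend(x):
    """Plain dot-product self-attention over the second-to-last axis."""
    d = x.shape[-1]
    logits = np.einsum('...id,...jd->...ij', x, x) / np.sqrt(d)
    return np.einsum('...ij,...jd->...id', softmax(logits), x)

def axial_msa_attention(msa):
    """Axial attention over an MSA of shape (rows, cols, dim):
    row-wise attention mixes columns, then column-wise attention
    mixes rows. Cost scales like r*c*(r + c) rather than (r*c)^2
    for full attention over the flattened alignment."""
    x = attend(msa)                   # along rows (across columns)
    x = attend(x.transpose(1, 0, 2))  # along columns (across rows)
    return x.transpose(1, 0, 2)

msa = np.random.randn(4, 8, 16)
print(axial_msa_attention(msa).shape)  # (4, 8, 16)
```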

0.0.19

11 Feb 19:57
add ability to toggle inter MSA self attention

0.0.18

11 Feb 18:52
add an assert to make sure total sequence length of MSA or AA does not exceed the set length of the sparse attention kernel
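The guard described above amounts to something like this. The function name and message are hypothetical, but the idea is exactly the release's: sparse attention kernels are built for a fixed maximum length, so longer inputs should fail loudly up front rather than be silently truncated.

```python
def check_fits_sparse_kernel(seq_len, kernel_max_len):
    """Reject sequences longer than the sparse attention kernel's
    fixed maximum length (illustrative guard, not the repo's code)."""
    assert seq_len <= kernel_max_len, (
        f'sequence length {seq_len} exceeds the sparse attention '
        f'kernel length {kernel_max_len}')

check_fits_sparse_kernel(1024, 2048)  # within bounds, passes silently
```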

0.0.17

11 Feb 18:18
make all masking optional

0.0.16

10 Feb 00:32
add memory compressed attention to make cross attention more efficient
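One way to picture memory compressed attention, sketched with my own simplifications: keys and values are downsampled along the context length (here by mean pooling in place of a learned strided convolution), so the attention matrix shrinks by the compression ratio.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def memory_compressed_cross_attention(queries, context, ratio=4):
    """Cross attention where the context (keys/values) is pooled by
    `ratio` along its length, cutting the attention matrix by the
    same factor. Mean pooling stands in for the learned strided
    convolution used in practice."""
    n, d = context.shape
    pad = (-n) % ratio  # right-pad so the length divides evenly
    if pad:
        context = np.concatenate([context, np.zeros((pad, d))], axis=0)
    pooled = context.reshape(-1, ratio, d).mean(axis=1)  # (ceil(n/ratio), d)
    logits = queries @ pooled.T / np.sqrt(d)
    return softmax(logits) @ pooled

q = np.random.randn(8, 16)
ctx = np.random.randn(64, 16)
print(memory_compressed_cross_attention(q, ctx).shape)  # (8, 16)
```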

0.0.11

04 Feb 19:56

0.0.10

04 Feb 18:25
complete reversibility
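"Complete reversibility" refers to reversible (RevNet-style) blocks: the inputs can be reconstructed exactly from the outputs, so intermediate activations need not be stored during backprop. A minimal sketch of the forward and inverse computations, with illustrative function names:

```python
import numpy as np

def reversible_block(x1, x2, f, g):
    """Forward pass of a reversible block:
    y1 = x1 + f(x2); y2 = x2 + g(y1)."""
    y1 = x1 + f(x2)
    y2 = x2 + g(y1)
    return y1, y2

def invert_reversible_block(y1, y2, f, g):
    """Exact inverse: recover (x1, x2) from (y1, y2) without any
    stored activations, by undoing the two residual updates."""
    x2 = y2 - g(y1)
    x1 = y1 - f(x2)
    return x1, x2

f = lambda t: np.tanh(t)
g = lambda t: 0.5 * t
x1, x2 = np.random.randn(4), np.random.randn(4)
y1, y2 = reversible_block(x1, x2, f, g)
r1, r2 = invert_reversible_block(y1, y2, f, g)
print(np.allclose(x1, r1) and np.allclose(x2, r2))  # True
```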

0.0.9

03 Feb 18:11
66e1289
bump

0.0.8

03 Feb 16:55
more cleanup