
use stable softmax in attention
lucidrains committed Aug 9, 2021
1 parent e6ed6f4 commit b88e9e3
Showing 2 changed files with 2 additions and 1 deletion.
1 change: 1 addition & 0 deletions alphafold2_pytorch/alphafold2.py
@@ -161,6 +161,7 @@ def forward(self, x, mask = None, attn_bias = None, context = None, context_mask

         # attention

+        dots = dots - dots.max(dim = -1, keepdims = True).values
         attn = dots.softmax(dim = -1)
         attn = self.dropout(attn)
2 changes: 1 addition & 1 deletion setup.py
@@ -3,7 +3,7 @@
 setup(
   name = 'alphafold2-pytorch',
   packages = find_packages(),
-  version = '0.4.23',
+  version = '0.4.24',
   license='MIT',
   description = 'AlphaFold2 - Pytorch',
   author = 'Phil Wang, Eric Alcaide',
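The added line in alphafold2_pytorch/alphafold2.py is the standard log-sum-exp shift: softmax is invariant to subtracting the same constant from every logit, so subtracting the row-wise maximum bounds every exponent by exp(0) = 1 and keeps exp() from overflowing when attention logits grow large. Below is a minimal standalone sketch of the trick, not the repo's actual attention module; stable_softmax is an illustrative name, and it uses the canonical keepdim spelling where the commit uses the numpy-style alias keepdims.

import torch

def stable_softmax(dots, dim = -1):
    # softmax(x) == softmax(x - c) for any constant c: the shift cancels
    # between numerator and denominator. Subtracting the row-wise max
    # bounds every exponent by exp(0) = 1, so exp() cannot overflow.
    dots = dots - dots.max(dim = dim, keepdim = True).values
    return dots.softmax(dim = dim)

# a naive softmax overflows: exp(999.) is inf in float32, and inf / inf is nan
x = torch.tensor([999., 1000., 998.])
print(x.exp() / x.exp().sum())    # tensor([nan, nan, nan])
print(stable_softmax(x))          # tensor([0.2447, 0.6652, 0.0900])

The shift also cancels in the backward pass (the softmax Jacobian annihilates the all-ones direction), so the gradient is identical whether or not the subtracted max is detached, and the commit's version without .detach() is correct.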
