
Commit 931466e

unnecessary, pytorch native softmax is numerically stable

lucidrains committed Jul 5, 2022
1 parent d59cb1e commit 931466e
Showing 2 changed files with 2 additions and 2 deletions.
1 change: 0 additions & 1 deletion alphafold2_pytorch/alphafold2.py
@@ -168,7 +168,6 @@ def forward(self, x, mask = None, attn_bias = None, context = None, context_mask
 
         # attention
 
-        dots = dots - dots.max(dim = -1, keepdims = True).values
         attn = dots.softmax(dim = -1)
         attn = self.dropout(attn)
 
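Why the deleted line was safe to drop: torch.softmax subtracts the row-wise maximum internally before exponentiating, so the hand-rolled `dots - dots.max(...)` shift duplicated work the native op already does. A minimal sketch of the equivalence, assuming only PyTorch (the `dots` tensor below is illustrative, not data from the repository):

import torch

# logits large enough that a naive exp() would overflow float32
dots = torch.tensor([[1000.0, 1001.0, 1002.0]])

# the shift this commit removes, applied by hand before softmax
manual = (dots - dots.max(dim = -1, keepdim = True).values).softmax(dim = -1)

# pytorch's native softmax performs the same max-subtraction internally
native = dots.softmax(dim = -1)

print(torch.allclose(manual, native))  # True: identical results, no inf/nan
print(native)                          # tensor([[0.0900, 0.2447, 0.6652]])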
3 changes: 2 additions & 1 deletion setup.py
@@ -3,9 +3,10 @@
 setup(
   name = 'alphafold2-pytorch',
   packages = find_packages(),
-  version = '0.4.31',
+  version = '0.4.32',
   license='MIT',
   description = 'AlphaFold2 - Pytorch',
+  long_description_content_type = 'text/markdown',
   author = 'Phil Wang, Eric Alcaide',
   author_email = 'lucidrains@gmail.com, ericalcaide1@gmail.com',
   url = 'https://github.com/lucidrains/alphafold2',
