[bugfix] Random attention / constant mask over batch (facebookresearc…
blefaudeux committed Apr 16, 2021
1 parent f481e9f commit 4106b4f
Showing 1 changed file with 4 additions and 1 deletion.
5 changes: 4 additions & 1 deletion xformers/components/attention/random.py
@@ -50,7 +50,10 @@ def __init__(
         self.constant_masking = constant_masking

     def _get_rand_mask(self, shape: torch.Size) -> torch.Tensor:
-        mask = torch.FloatTensor(shape[0], shape[1], shape[1]).uniform_() < self.r
+        mask = torch.FloatTensor(shape[1], shape[1]).uniform_() < self.r
+        mask = mask.unsqueeze(0).expand(
+            shape[0], -1, -1
+        )  # duplicate the mask over the batch dimension

         # Sparsity threshold, below that having a sparse matrix is more efficient
         if self.r < _SPARSITY_THRESHOLD:
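The fix changes the random mask from being sampled independently per batch entry to being sampled once and broadcast over the batch, which matches the "constant mask over batch" intent of the commit message. A minimal standalone sketch of the fixed behavior (the function name and parameters here are illustrative, not the actual xformers API; only torch is assumed):

```python
import torch

def get_rand_mask(shape: torch.Size, r: float) -> torch.Tensor:
    # Draw a single (seq_len x seq_len) Bernoulli(r) mask...
    mask = torch.FloatTensor(shape[1], shape[1]).uniform_() < r
    # ...then expand (not re-sample) it over the batch dimension,
    # so every batch entry shares the same sparsity pattern.
    return mask.unsqueeze(0).expand(shape[0], -1, -1)

batch, seq = 4, 8
mask = get_rand_mask(torch.Size((batch, seq)), r=0.5)
# expand() creates a broadcast view, so all batch slices are identical.
assert all(torch.equal(mask[0], mask[i]) for i in range(batch))
```

Note that `expand` returns a view without copying memory, so the shared mask costs no more than the 2D one; the pre-fix code allocated and sampled a full 3D mask, giving each batch entry a different random pattern.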
