First of all, thank you for putting such an easy-to-use implementation on GitHub.
I'm trying to incorporate Nystrom attention into a legacy codebase that previously passed the input X and a mask (of the same dimensions as X) to a multi-head attention layer.
When I integrate nystrom-attention, it runs fine without the mask.
But as soon as I pass the mask alongside the input, it throws an einops rearrange error.
Sorry if this is a very basic question, but how would you recommend I handle a 3D mask (same dimensions as the input) in this codebase?
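For context, here is a minimal sketch of the workaround I'm considering. It assumes the 3D mask marks valid positions (True = keep) and that the attention layer wants a per-token boolean mask of shape (batch, seq_len), so it collapses the feature dimension with `any`. The shapes and semantics here are assumptions about my legacy code, not something from this repo:

```python
import torch

# Hypothetical legacy shapes: x and its mask are both (batch, seq_len, dim).
batch, seq_len, dim = 2, 128, 64
x = torch.randn(batch, seq_len, dim)
mask_3d = torch.ones(batch, seq_len, dim, dtype=torch.bool)
mask_3d[:, 100:, :] = False  # e.g. padding after position 100

# Collapse the feature dimension: a token is valid if any feature is unmasked.
mask_2d = mask_3d.any(dim=-1)  # shape (batch, seq_len), boolean

# The 2D mask would then be passed to the attention layer, e.g.:
# out = attn(x, mask=mask_2d)
print(mask_2d.shape)  # torch.Size([2, 128])
```

Would collapsing the mask like this be the right approach, or does the layer expect something else?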
Best,
VB