In PyTorch's official nn.TransformerEncoder, the forward method takes an optional src_key_padding_mask argument, which masks padded key positions in each batch. Does the x_transformers library offer a similar optional masking mechanism, specifically one that masks only the keys?
I have defined the network structure above, and I want to use it as follows:
Which mask should I use?
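For context on what a key padding mask does, here is a minimal NumPy sketch of scaled dot-product attention with the keys of padded positions masked out. Note that the two libraries use opposite conventions (this summary is an assumption to verify against each library's documentation): PyTorch's `src_key_padding_mask` uses `True` to mean "this position is padding, ignore it", while x_transformers' `mask` keyword uses `True` to mean "this position is valid, attend to it".

```python
import numpy as np

def attention_with_key_padding(q, k, v, key_is_pad):
    """q, k, v: (seq, dim) arrays; key_is_pad: (seq,) bool, True = padding.

    Masks only the KEYS: padded positions receive zero attention weight,
    but queries at padded positions still produce (meaningless) outputs,
    mirroring how a key padding mask behaves in an encoder.
    """
    scores = q @ k.T / np.sqrt(k.shape[-1])           # (seq, seq) logits
    scores[:, key_is_pad] = -np.inf                   # kill padded key columns
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over keys
    return weights @ v

rng = np.random.default_rng(0)
q = k = v = rng.standard_normal((4, 8))

# PyTorch-style convention: True = padding (last two tokens are pads)
pad = np.array([False, False, True, True])
out = attention_with_key_padding(q, k, v, pad)

# x_transformers-style convention would be the logical NOT of `pad`:
valid = ~pad
```

So if your boolean tensor marks valid tokens as `True`, it should be passed as the mask in x_transformers; if it marks padding as `True`, invert it first (e.g. `mask=~pad`).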