
torch.nn.TransformerEncoderLayer missing exception description information. #89394

Open
triumph-wangyuyang opened this issue Nov 21, 2022 · 2 comments
Labels

module: error checking - Bugs related to incorrect/lacking error checking
module: nn - Related to torch.nn
oncall: transformer/mha - Issues related to Transformers and MultiheadAttention
triaged - This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

triumph-wangyuyang commented Nov 21, 2022

🐛 Describe the bug

import torch
encoder_layer = torch.nn.TransformerEncoderLayer(d_model=52, nhead=1)
src = torch.rand(10, 32, 512)  # last dimension (512) does not match d_model (52)
out = encoder_layer(src)       # raises a bare AssertionError on 1.8.0

When running this code on PyTorch 1.8.0, an AssertionError is thrown with no message explaining what went wrong. The real problem is that d_model (52) does not equal the size of the last dimension of src (512). It would be helpful if PyTorch raised an exception with a descriptive message in this case.
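For reference, a corrected version of the snippet, assuming the intent was a 512-dimensional model, runs without error:

import torch
# d_model now matches the last dimension of src (both 512).
encoder_layer = torch.nn.TransformerEncoderLayer(d_model=512, nhead=8)
src = torch.rand(10, 32, 512)  # (seq_len, batch, d_model)
out = encoder_layer(src)
print(out.shape)  # torch.Size([10, 32, 512])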

Versions

pytorch: 1.8.0
Python version: 3.8
CUDA/cuDNN version: cuDNN 11.1
GPU models and configuration: RTX3060
Operating System: Windows

cc @albanD @mruberry @jbschlosser @walterddr @kshitij12345 @saketh-are @bhosmer @cpuhrsch @erichan1

samdow (Contributor) commented Nov 21, 2022

Are you strongly attached to 1.8.0? On 1.13, it gives a more informative error:

  File "/Users/samdow/anaconda3/envs/1.13/lib/python3.10/site-packages/torch/nn/functional.py", line 5046, in multi_head_attention_forward
    assert embed_dim == embed_dim_to_check, \
AssertionError: was expecting embedding dimension of 52, but got 512

Since d_model is the number of expected features in the input according to the docs, it must match the final dimension of src.
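For anyone pinned to 1.8.0, a minimal workaround is to validate the input shape before calling the layer. The sketch below is not PyTorch's own check; it assumes the layer exposes its attention module as self_attn, whose embed_dim attribute holds the d_model the layer was built with:

import torch

def checked_encode(layer: torch.nn.TransformerEncoderLayer, src: torch.Tensor) -> torch.Tensor:
    # embed_dim on the self-attention submodule equals the layer's d_model.
    d_model = layer.self_attn.embed_dim
    if src.size(-1) != d_model:
        raise ValueError(
            f"expected {d_model} features in the last dimension of src, got {src.size(-1)}"
        )
    return layer(src)

# Usage: raises a descriptive ValueError instead of a bare AssertionError.
layer = torch.nn.TransformerEncoderLayer(d_model=52, nhead=1)
checked_encode(layer, torch.rand(10, 32, 512))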

@samdow samdow added the module: nn and triaged labels Nov 21, 2022
triumph-wangyuyang (Author) commented

TransformerEncoderLayer

I'm currently running a test experiment that requires PyTorch 1.8, and using TransformerEncoderLayer there triggers this problem, which was very confusing. I have since tested with a newer version and found that it already provides the corresponding error message. Thank you.

@mruberry mruberry added the oncall: transformer/mha and module: error checking labels Nov 21, 2022