
Only works if pixel_size**2 == patch_size? #3

Closed
PhilippMarquardt opened this issue Mar 4, 2021 · 1 comment

Comments

@PhilippMarquardt

Hi,
is this only supposed to work if

pixel_size**2 == patch_size 

? When setting patch_size to any value that doesn't satisfy this equation, the following error occurs:

--> 146         pixels += rearrange(self.pixel_pos_emb, 'n d -> () n d')
    147 
    148         for pixel_attn, pixel_ff, pixel_to_patch_residual, patch_attn, patch_ff in self.layers:

RuntimeError: The size of tensor a (4) must match the size of tensor b (64) at non-singleton dimension 1

The error comes up when running:

import torch
from transformer_in_transformer import TNT

tnt = TNT(
    image_size = 128,       # size of image
    patch_dim = 256,        # dimension of patch token
    pixel_dim = 24,         # dimension of pixel token
    patch_size = 16,        # patch size
    pixel_size = 2,         # pixel size
    depth = 6,              # depth
    heads = 1,              # number of attention heads
    num_classes = 2,        # output number of classes
    attn_dropout = 0.1,     # attention dropout
    ff_dropout = 0.1        # feedforward dropout
)

img = torch.randn(2, 3, 128, 128)
logits = tnt(img)

Since I am completely new to einops, it's quite hard for me to debug :D Thanks!
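
Just to spell out where those two numbers seem to come from (my own back-of-the-envelope arithmetic, not code from the repo, so the names below are guesses):

# Shape arithmetic for the config above. These are the two sizes that collide
# in the error message; which one belongs to the pixel tokens vs. the
# positional embedding depends on the implementation.
patch_size = 16
pixel_size = 2

# Unfolding a patch_size x patch_size patch into pixel_size x pixel_size cells
# yields this many pixel tokens per patch.
tokens_per_patch = (patch_size // pixel_size) ** 2   # 64

# The other quantity floating around is pixel_size ** 2.
pixel_size_sq = pixel_size ** 2                      # 4

# They only agree when pixel_size ** 2 == patch_size, e.g. pixel_size = 4 with
# patch_size = 16, which would match the behaviour described in the title.
print(tokens_per_patch, pixel_size_sq)               # 64 4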

@PhilippMarquardt PhilippMarquardt changed the title Changing pixel size to something besides 4 causes error Only works if pixel_size**2 == patch_size? Mar 4, 2021
@lucidrains
Owner

@PhilippMarquardt turns out I had some 🐛 🪲 lol

fixed in the latest commit 7621ce6!
