
AttributeError: 'Block' object has no attribute 'drop_path' #26

Open
MohamedAliRashad opened this issue Apr 14, 2023 · 5 comments

Comments

@MohamedAliRashad

I keep getting the error in the title. Is there a solution for it?

@leejaeyong7

I've found that the bug is caused by the timm version.
If you install an older version of timm, it will work.

FYI, I've used timm==0.6.7
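If you would rather fail fast than hit the AttributeError deep inside the model code, a version guard can be run before loading ZoeDepth. This is a minimal sketch, not part of ZoeDepth; the 0.7.0 cutoff is an assumption based on the versions reported in this thread (0.6.x works, newer does not), and `check_timm_version` is a hypothetical helper name:

```python
from importlib import metadata


def version_tuple(version: str) -> tuple[int, ...]:
    """Parse a dotted version string like '0.6.7' into a comparable tuple."""
    return tuple(int(part) for part in version.split(".") if part.isdigit())


def check_timm_version(max_exclusive: str = "0.7.0") -> None:
    """Raise early if the installed timm is newer than the last known-good series.

    Assumption: the drop_path -> drop_path1/drop_path2 rename landed after the
    0.6.x series, per the reports in this thread.
    """
    installed = metadata.version("timm")  # e.g. '0.6.7'
    if version_tuple(installed) >= version_tuple(max_exclusive):
        raise RuntimeError(
            f"timm=={installed} is too new for ZoeDepth's bundled beit.py; "
            f"install an older version, e.g. pip install timm==0.6.7"
        )
```

Calling `check_timm_version()` at the top of a script gives a clear actionable error instead of the opaque AttributeError.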

@ntakouris

That's a problem with the MiDaS models (and it carries over to ZoeDepth).

dapperdappy added a commit to dapperdappy/WarpFusion that referenced this issue Jun 10, 2023
This resolves the error `AttributeError: 'Block' object has no attribute 'drop_path'` when using the depth controlnet.

Found fix on:
isl-org/ZoeDepth#26
@palol

palol commented Aug 9, 2023

I was able to get it to work by reverting to timm==0.6.13. See the original discussion on the MiDaS repo: isl-org/MiDaS#215

@charliememory

FYI, you can also modify the source code in beit.py to match timm.models.beit.py without downgrading the timm version, i.e. rename the variable drop_path to drop_path1 and drop_path2.
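If you'd rather not edit beit.py on disk, the same rename can also be bridged the other way with a runtime shim: alias each block's new-style drop_path1 back to the legacy drop_path name that the old beit.py looks up. A minimal sketch, assuming the loaded blocks carry timm's renamed attributes; `alias_drop_path` is a hypothetical helper, not part of timm or ZoeDepth:

```python
def alias_drop_path(blocks) -> None:
    """Expose timm's renamed drop_path1 under the legacy drop_path name.

    Within a timm beit Block the attention and MLP branches are built with the
    same drop-path rate, so pointing the legacy name at drop_path1 should
    preserve behaviour. Blocks that already have drop_path are left untouched.
    """
    for block in blocks:
        if hasattr(block, "drop_path1") and not hasattr(block, "drop_path"):
            block.drop_path = block.drop_path1
```

With a loaded model this would be called as, e.g., `alias_drop_path(midas.pretrained.model.blocks)` before running inference.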

@Connah-rs


Following on from @charliememory's comment - you can set the appropriate names on MiDaS fetched from torch.hub with something along the lines of:

import types
from typing import Optional

import torch


def block_forward(self, x, resolution, shared_rel_pos_bias: Optional[torch.Tensor] = None):
    """
    Modification of timm.models.beit.py: Block.forward to support arbitrary window sizes.
    """
    if self.gamma_1 is None:
        x = x + self.drop_path1(self.attn(self.norm1(x), resolution, shared_rel_pos_bias=shared_rel_pos_bias))
        x = x + self.drop_path2(self.mlp(self.norm2(x)))
    else:
        x = x + self.drop_path1(self.gamma_1 * self.attn(self.norm1(x), resolution,
                                                         shared_rel_pos_bias=shared_rel_pos_bias))
        x = x + self.drop_path2(self.gamma_2 * self.mlp(self.norm2(x)))
    return x

midas = torch.hub.load(
    "intel-isl/MiDaS", "DPT_BEiT_L_384",
    pretrained=True,
    force_reload=True, trust_repo=True,
)
# Rebind the patched forward as a bound method on each transformer block.
for block in midas.pretrained.model.blocks:
    block.forward = types.MethodType(block_forward, block)

Works for me with timm==0.9.2.
