AttributeError – revert to timm 0.6.13 #215

Closed
WorldofDepth opened this issue May 12, 2023 · 3 comments

Comments

@WorldofDepth
I am using MiDaS v3.1 via Google Colab. It worked fine a few days ago, but now running `!python run.py --model_type dpt_beit_large_512 --input_path input --output_path output` fails with the following error:

Start processing
  Processing input/8-s.png (1/9)
    Input resized to 512x608 before entering the encoder
Traceback (most recent call last):
  File "/content/MiDaS/run.py", line 276, in <module>
    run(args.input_path, args.output_path, args.model_weights, args.model_type, args.optimize, args.side, args.height,
  File "/content/MiDaS/run.py", line 154, in run
    prediction = process(device, model, model_type, image, (net_w, net_h), original_image_rgb.shape[1::-1],
  File "/content/MiDaS/run.py", line 61, in process
    prediction = model.forward(sample)
  File "/content/MiDaS/midas/dpt_depth.py", line 166, in forward
    return super().forward(x).squeeze(dim=1)
  File "/content/MiDaS/midas/dpt_depth.py", line 114, in forward
    layers = self.forward_transformer(self.pretrained, x)
  File "/content/MiDaS/midas/backbones/beit.py", line 15, in forward_beit
    return forward_adapted_unflatten(pretrained, x, "forward_features")
  File "/content/MiDaS/midas/backbones/utils.py", line 86, in forward_adapted_unflatten
    exec(f"glob = pretrained.model.{function_name}(x)")
  File "<string>", line 1, in <module>
  File "/content/MiDaS/midas/backbones/beit.py", line 125, in beit_forward_features
    x = blk(x, resolution, shared_rel_pos_bias=rel_pos_bias)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/content/MiDaS/midas/backbones/beit.py", line 102, in block_forward
    x = x + self.drop_path(self.gamma_1 * self.attn(self.norm1(x), resolution,
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1614, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'Block' object has no attribute 'drop_path'. Did you mean: 'drop_path1'?

Does anyone know how this can be fixed? Thank you.

@Jukka-Sun

Did you find a solution?

@WorldofDepth
Author

No—I wish! Any idea?

I tried reverting Python 3.10.11 to 3.8.10, but it seems the torch module is missing in the latter.

@Gasuhu

Gasuhu commented May 19, 2023

You just need to use an older version of timm; this version worked for me:
!pip install timm==0.6.13
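For reference, a tiny plain-Python sketch (the `0.6.13` pin is taken from the comment above; the helper names are made up) that checks whether an installed timm version is newer than the last known-compatible release:

```python
# Sketch: decide whether an installed timm would need to be downgraded
# to 0.6.13, the last release reported to work with MiDaS v3.1 above.
COMPATIBLE = "0.6.13"

def parse_version(v: str) -> tuple:
    """Turn a dotted version string like "0.6.13" into (0, 6, 13)."""
    return tuple(int(part) for part in v.split("."))

def needs_downgrade(installed: str) -> bool:
    """True if the installed version is newer than the compatible pin."""
    return parse_version(installed) > parse_version(COMPATIBLE)

print(needs_downgrade("0.9.2"))   # newer timm, rename applies: True
print(needs_downgrade("0.6.13"))  # pinned version: False
```

Note that in Colab, after `!pip install timm==0.6.13` you may also need to restart the runtime so the previously imported timm module is not reused.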
