
[pnnx]:torch.clamp_min convert failed #5429

Closed
deepage opened this issue Apr 16, 2024 · 2 comments

deepage (Contributor) commented Apr 16, 2024

error log

model

  1. Define the model:
import torch
import torch.nn as nn
import torch.nn.functional as F

class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()

    def forward(self, x):
        x = torch.clamp_min(x, min=0)
        return x

model = Model()
model.eval()

input = torch.rand(1, 81, 512)
trace = torch.jit.trace(model, input)
trace.save('clamp_min.torchscript')
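
Before handing the file to pnnx, the saved TorchScript can be sanity-checked by reloading it and comparing against the eager model (a minimal sketch using the standard torch.jit.load / torch.allclose APIs; model and input are the objects defined above):

loaded = torch.jit.load('clamp_min.torchscript')
# the traced graph should match eager execution exactly for this model
assert torch.allclose(loaded(input), model(input))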

how to reproduce

  1. pnnx clamp_min.torchscript inputshape=[1,81,512]
  2. The conversion emits the following warnings:
fallback batch axis 233 for operand 1
fallback batch axis 233 for operand 2
ignore pnnx.Expression pnnx_expr_0 param expr=0
  3. The forward code in the generated clamp_min_pnnx.py is:
        v_1 = 0
        v_2 = aten::clamp_min(v_0, v_1)
        return v_2

The correct result can be obtained by changing the line to v_2 = torch.clamp_min(v_0, v_1).
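
For reference, a hedged sketch of what the patched forward in the generated clamp_min_pnnx.py could look like (the v_0/v_1/v_2 names come from the snippet above; the exact function signature pnnx emits is an assumption here):

import torch

def forward(v_0):
    v_1 = 0
    # call the documented Python API instead of emitting the raw aten::clamp_min
    # op name, which is not a valid Python expression
    v_2 = torch.clamp_min(v_0, v_1)
    return v_2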

nihui added the bug label May 9, 2024

nihui (Member) commented May 9, 2024

Reproduced.

nihui (Member) commented May 9, 2024

Update to torch 2.1 or later, re-export the torchscript, and try again.
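
A minimal sketch of that suggestion, re-using the model definition from the report (it assumes an environment with torch >= 2.1 already installed):

import torch
print(torch.__version__)  # expect 2.1.0 or newer

model = Model().eval()
input = torch.rand(1, 81, 512)
trace = torch.jit.trace(model, input)
trace.save('clamp_min.torchscript')
# then re-run: pnnx clamp_min.torchscript inputshape=[1,81,512]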

nihui removed the bug label May 9, 2024
nihui closed this as completed May 9, 2024