
[BUG] Resize simplifier behavior should be changed #34

Closed
lucasjinreal opened this issue Nov 25, 2019 · 13 comments

Comments

@lucasjinreal

Currently the model is simplified like this:

[screenshot: the simplified model graph]

But we need to convert to TensorRT, and when converting we got:

----------------------------------------------------------------
Input filename:   model_upsample_sim.onnx
ONNX IR version:  0.0.4
Opset version:    11
Producer name:    pytorch
Producer version: 1.3
Domain:           
Model version:    0
Doc string:       
----------------------------------------------------------------
Parsing model


onnx-tensorrt/ModelImporter.cpp:98 In function importInputs:
[8] Assertion failed: convert_onnx_weights(initializer, &weights)

@lucasjinreal
Author

This can be reproduced by running python3 -m onnxsim on the model exported by:

import torch
import torch.nn as nn
import torch.nn.functional as F

# defined here so the snippet is self-contained
device = 'cpu'
model_p = 'model_upsample.onnx'


class TinyModel(nn.Module):
    def __init__(self):
        super(TinyModel, self).__init__()
        self.expander = nn.Conv2d(3, 192, 1, 1)

        # upsample causes the Gather error
        self.P4_upsampled = nn.Upsample(scale_factor=2, mode='nearest')

    def forward(self, x):
        x = self.expander(x)
        # a = self.P4_upsampled(x)

        sh = torch.tensor(x.shape[-2:])
        print(sh)
        a = F.interpolate(x, (sh[0] * 2, sh[1] * 2))
        return a


def export_onnx():
    model = TinyModel().to(device)
    sample_input = torch.rand(1, 3, 544, 1920).to(device)
    model.eval()
    torch.onnx.export(model, sample_input, model_p, input_names=['img'],
                      output_names=['values'], opset_version=11)
    print('onnx model exported. forward now...')


if __name__ == "__main__":
    export_onnx()
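For reference, the Shape/Gather/Concat subgraph the exporter emits only computes the output sizes at runtime; the computation the thread wants collapsed into one Resize node is plain nearest-neighbor upsampling. A minimal pure-Python sketch of that computation on a single 2D channel (not part of the original report, for illustration only):

```python
def nearest_upsample_2d(x, scale=2):
    """Nearest-neighbor upsample of a 2D grid: the per-channel computation
    a single ONNX Resize (mode='nearest') performs."""
    h, w = len(x), len(x[0])
    out = [[0] * (w * scale) for _ in range(h * scale)]
    for i in range(h * scale):
        for j in range(w * scale):
            # Each output pixel maps back to its nearest source pixel.
            out[i][j] = x[i // scale][j // scale]
    return out

# A 2x2 input becomes 4x4, with each value repeated in a 2x2 block.
print(nearest_upsample_2d([[1, 2], [3, 4]]))
```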

@daquexian
Owner

daquexian commented Nov 25, 2019

I think it is a bug of onnx (onnx/onnx#2417) and not related to onnxsim itself. Please re-export your onnx model according to what onnx/onnx#1385 (comment) suggests

@lucasjinreal
Author

Thanks for the reply. Better to say this is a PyTorch exporting bug. I have posted an optimization issue in PyTorch.

However, do u have any suggestions on the simplifier result of such situation?

Why does it fail when converting? (Actually I think the simplification process is right and reasonable.) It just cannot be converted to TensorRT.

@daquexian
Owner

> Thanks for the reply. Better to say this is a PyTorch exporting bug. I have posted an optimization issue in PyTorch.

No, it's an onnx bug, please check out onnx/onnx#2198

@daquexian
Owner

> Why does it fail when converting? (Actually I think the simplification process is right and reasonable.) It just cannot be converted to TensorRT.

Have you tried re-exporting your ONNX model with keep_initializers_as_inputs=True?

@lucasjinreal
Author

That doesn't help; the initializers were generated after simplification.

What concerns me is that this is a common issue: if you call an upsample, interpolate, resize, etc. operation in your model, you get a complicated graph:

[screenshot: the complicated exported graph]

But actually we might only need a single Resize op with a sizes param; however, this currently cannot be done, and I don't know the root cause.

[screenshot: the desired single Resize op]

What if the sizes param were stored as some type other than an initializer?

@lucasjinreal
Author

The initializer might be the root cause. This could be solved on the PyTorch side, the ONNX side, the onnx-simplifier side, or even the onnx-tensorrt side.

But none of them do this.....

@daquexian
Owner

> That doesn't help; the initializers were generated after simplification.

Sorry, I didn't understand you.

@lucasjinreal
Author

lucasjinreal commented Nov 25, 2019

Sorry

> Have you tried re-exporting your ONNX model with keep_initializers_as_inputs=True?

This advice doesn't help; the result is the same, and it's not the root cause of the problem. Anyway, it's not related to onnx-simplifier, since onnx-simplifier is just a wrapper around onnx.

@daquexian
Owner

> But actually we might only need a single Resize op with a sizes param; however, this currently cannot be done

If you are asking about the roi<?> and scales<?> in the screenshot, onnx/onnx#2451 contains more information.

@nihui

nihui commented Nov 25, 2019

I also ran into the case where Resize converts into a pile of messy ops. I couldn't sort it out, so I just edited the ncnn model directly:

https://zhuanlan.zhihu.com/p/93017149

@lucasjinreal
Author

nihui, the master... I bow to you.

BTW, I found it can be worked around by manually grafting parts of the graph; it requires some finer surgical edits to the ONNX model.
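The "grafting" above can be sketched as follows. This is a toy model of the surgery using plain dicts rather than the real onnx API (the helper and all node names are hypothetical): rewire Resize's sizes input to a precomputed constant, then prune the now-dead Shape → Gather → Concat chain. A real pass would also need to avoid pruning nodes the live graph still consumes.

```python
def graft_resize_sizes(nodes, resize_name, const_sizes_name):
    # Map each tensor name to the node that produces it.
    by_output = {out: n for n in nodes for out in n["outputs"]}
    resize = next(n for n in nodes if n["name"] == resize_name)
    old_sizes = resize["inputs"][-1]          # the runtime-computed sizes tensor
    resize["inputs"][-1] = const_sizes_name   # point Resize at the constant

    # Walk back from the old sizes tensor and collect the dead subgraph.
    dead, stack = set(), [old_sizes]
    while stack:
        out = stack.pop()
        node = by_output.get(out)
        if node and node["name"] not in dead:
            dead.add(node["name"])
            stack.extend(node["inputs"])
    return [n for n in nodes if n["name"] not in dead]

nodes = [
    {"name": "shape",  "inputs": ["x"], "outputs": ["s"]},
    {"name": "gather", "inputs": ["s"], "outputs": ["g"]},
    {"name": "concat", "inputs": ["g"], "outputs": ["sizes"]},
    {"name": "resize", "inputs": ["x", "roi", "scales", "sizes"],
     "outputs": ["y"]},
]
pruned = graft_resize_sizes(nodes, "resize", "const_sizes")
print([n["name"] for n in pruned])  # only "resize" remains
```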

@luoduo21

@nihui So per your suggestion, for x = F.interpolate(input=x, size=(self.up_size, self.up_size), mode='bilinear') in PyTorch, I just export to ONNX, convert to ncnn, and then modify the ncnn model by hand? With opset 9 I can export an Upsample op. What I want to ask is: if I convert that directly to ncnn and then modify the ncnn model, will the outputs be identical?
