
SparseInst export to ONNX input data format #75

Closed
LukasMahieuArinti opened this issue Jul 5, 2022 · 8 comments

Comments

@LukasMahieuArinti

Hi,
I tried exporting the weights of SparseInst (GIAM) that are in this repository to ONNX format using export.py with the following command (I assume I need to use this command? The documentation reads 'use export_onnx.py', but there's no export_onnx.py in the current branch of this repository).

python export.py --config-file configs/coco/sparseinst/sparse_inst_r50_giam_aug.yaml --opts MODEL.WEIGHTS weights/base_giam.pth INPUT.MIN_SIZE_TEST 512

This fails with the following error:

  File ".../yolov7/yolov7/modeling/meta_arch/sparseinst.py", line 95, in <listcomp>
    images = [x["image"].to(self.device) for x in batched_inputs]
IndexError: too many indices for tensor of dimension 3

Any ideas on how to fix this?
Thank you.

@lucasjinreal
Owner

It was renamed. If I guess right, an ONNX file has already been generated; just ignore the error.

@LukasMahieuArinti
Author

I actually want to run the ONNX conversion on my custom-trained model as well, which is why I'm looking to get the error fixed :/
(Side note: I can't download the pretrained SparseInst ONNX file from Google Drive either; it seems to have been removed.)

@lucasjinreal
Owner

The ONNX model is int8, which doesn't seem to work at the moment.

@LukasMahieuArinti
Author

Okay, thanks for the reply. I'll try to look for a fix myself then; I'll let you know if I find one.

@LukasMahieuArinti
Author

It looks like the ONNX export itself works fine; it's the check that runs afterwards that fails. This happens because the forward pass through the SparseInst network expects differently formatted inputs in training than in evaluation.
To fix this, either comment out this check in export.py, or change this line and this line to something like if self.onnx_export:, so that the forward pass also goes through those specific preprocessing steps during testing (sketched below).
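Roughly, the change looks like this (a toy stand-in only, not the repo's actual code; the onnx_export attribute is assumed to be a flag that export.py sets before tracing):

    import torch

    class SparseInstLike(torch.nn.Module):
        # Toy stand-in showing the input-format branch, not the real model.
        def __init__(self):
            super().__init__()
            self.onnx_export = False  # export.py would set this to True before tracing

        def preprocess(self, batched_inputs):
            if self.onnx_export:
                # torch.onnx.export traces with a plain NCHW tensor, so skip the
                # detectron2-style list-of-dicts unpacking that raises
                # "IndexError: too many indices for tensor of dimension 3".
                return batched_inputs
            # Normal detectron2-style input: a list of {"image": CHW tensor} dicts.
            return torch.stack([x["image"] for x in batched_inputs])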

@lucasjinreal
Owner

The ONNX export is already finished once torch.onnx.export returns; the rest of the code just tests the ONNX inference logic. That part isn't necessary and can certainly fail. Just ignore it.

@mauricewells

Then how can I test the ONNX inference logic without the rest of the code?

@jackiewangCV

I have the same question as mauricewells. Could you share a solution?
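For what it's worth, once the .onnx file exists, the inference logic can be sanity-checked with onnxruntime alone, without any of the repo code. A minimal sketch, where the file name sparse_inst.onnx, the 512x512 input size (matching INPUT.MIN_SIZE_TEST above), and the float32 dtype are all assumptions to verify against your own export:

    import numpy as np
    import onnxruntime as ort

    sess = ort.InferenceSession("sparse_inst.onnx", providers=["CPUExecutionProvider"])
    inp = sess.get_inputs()[0]
    print("input:", inp.name, inp.shape, inp.type)  # check the real name/shape/dtype

    # Dummy NCHW batch; replace with a real preprocessed image.
    dummy = np.random.rand(1, 3, 512, 512).astype(np.float32)
    outputs = sess.run(None, {inp.name: dummy})
    for meta, out in zip(sess.get_outputs(), outputs):
        print(meta.name, out.shape)

This only confirms that the graph loads and executes; it says nothing about accuracy, and if the exported model is int8 as mentioned above, it may still fail here.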
