Export trained model to model.onnx file #32
@Chen-Xieyuanli can you comment on this?
Hi @leo94-chen, here is my hack. It is not well polished, but you could take it as an example: https://gist.github.com/Chen-Xieyuanli/980cc0a2c9b664b9cc279b19d61aa898#file-make_onnx-py
FWIW, I opened a fork that contains code to export to ONNX: https://github.com/andrewkouri/lidar-bonnetal/blob/master/train/tasks/semantic/create_onnx.py Unfortunately, I am still having problems opening the model in TensorRT because it is exported with an "ir_version" that is too new. I will update if I figure out how to deal with it.
This script works for me; I am able to run inference in TensorRT without problems. I have tried CUDA 10.1 with TensorRT 5.1 and CUDA 10.2 with TensorRT 6.0. Sorry, I forgot to mention: I modified the backbone, https://github.com/PRBonn/lidar-bonnetal/blob/master/train/backbones/darknet.py. If I don't comment out that line, I get an ATen-related export error.
@akouri-dd @balajiravichandiran
@caoyifeng001 you need to comment out line 159 in the file below. I modified the backbone: https://github.com/PRBonn/lidar-bonnetal/blob/master/train/backbones/darknet.py
@balajiravichandiran
We've experienced this too; it is caused by a bug in the indexing operation when exporting to ONNX. Since the model uses all channels, the indexing is virtually a no-op, so removing it works anyway.
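To illustrate why removing that line is safe: indexing a tensor with the full list of its channel indices returns an identical tensor, so dropping the operation cannot change the network's output. This is a generic demonstration, not the actual code from darknet.py:

```python
import torch

x = torch.randn(1, 32, 8, 8)

# Selecting every channel index is the identity operation: this is the kind
# of indexing that trips up the ONNX exporter without changing the data.
all_channels = list(range(x.shape[1]))
y = x[:, all_channels]

assert torch.equal(x, y)
```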
I think this is resolved. If you still have questions, please reopen the issue or comment on it.
Hi, thanks for your kind sharing.
Could you please provide the code for exporting the trained model to model.onnx file? Thanks.