Does BlendMask support export to ONNX? #43
This utility is not implemented in this version yet, but it is planned. We have tested it, and it should be fairly easy to implement.
It seems the community has already explored converting FCOS to ncnn. It would be very nice if BlendMask could support export to ONNX, with acceleration via TensorRT or deployment to mobile platforms. Hoping for your updates.
@stan-haochen
@blueardour I can train a BN-head BlendMask for you to convert.
@jinfagang Sorry for the late reply. I'm very willing to do the conversion if you can provide the BN-head model.
@blueardour OK, let's do it. I'd like to convert it to TensorRT once it can be converted to ONNX.
Thanks a lot for the work. Is there any news about this issue?
No, the BN-head model training is still pending...
@jinfagang @snaillp I've discussed this with @stan-haochen. We will put some effort into training and converting the model.
@blueardour Here is a BlendMask RT model trained with BN only: https://drive.google.com/open?id=1wMqOxOKCSeTRX_-xOomREbPNE3Ec87Ur
It has reached about 30 mAP so far; we can try to figure out the ONNX conversion first.
I will provide BN-head models for RT_R50 this week. |
@jinfagang Thanks for the trained model. I'm working on the conversion.
Waiting for your progress |
@ALL Exporting to ONNX requires opset 11 to produce results consistent with PyTorch. However, onnxruntime/caffe2/... software currently only supports opset 9. A workaround is to change bilinear interpolation to nearest mode and omit the align_corners parameter. As it is not 100% complete, I have only committed the code to my own repo rather than opening a PR on this one: https://github.com/blueardour/uofa-AdelaiDet
@blueardour How many ONNX ops are involved in the converted model? Can it be converted to a TensorRT engine through onnx-tensorrt?
Yes, verification in both TensorRT and onnxruntime has been added. Please check the new PR.
@blueardour Did you use onnxruntime's TensorRT execution provider to check TensorRT compatibility, or onnx-tensorrt to convert the ONNX model to an engine?
@jinfagang I found I made a mistake by using the Caffe2 engine; I'm switching to the TensorRT engine.
TensorRT may not be able to properly convert several ops such as RoIAlign; some surgery on how the PyTorch code is written may be needed before export.
@jinfagang I fixed the ONNX verification script and added an all-in-one demo script. Note that the test passes for FCOS but fails for BlendMask, as it contains an unsupported Upsample type.
So ROIAlign and NMS are blocked, meaning it can only run the preceding modules and cannot forward the whole model?
Yes. Layers such as ROIAlign and NMS are not exported, since they do not seem to be standardly supported in many frameworks, and even frameworks that do support them may use a different implementation. In the export script, one can check all the exported nodes by dumping the output_names.
@blueardour Did you manage to convert the ONNX model to TensorRT? I have a workaround to get it into a TRT engine.
I have trained a real-time version of the R-50 model: mask AP 35.1, 31 FPS on a 1080 Ti. You can find it here: https://github.com/aim-uofa/AdelaiDet/blob/master/configs/BlendMask/README.md#blendmask-real-time-models
@jinfagang I only tested the ONNX file with the onnx_tensorrt package; see lines 270-271 in https://github.com/blueardour/uofa-AdelaiDet/blob/master/onnx/test_onnxruntime.py I didn't test it in standalone TensorRT. P.S. My PR does not seem to be approved yet; see #80.
@blueardour Will you also write a standalone inference repo based on ONNX, in ncnn or some other framework, so that we can run inference and visualize the results?
Sorry @jinfagang, I'm currently busy preparing my conference paper; the deadline is coming. I can spare time for this issue around mid-June. If you cannot wait that long, please refer to my FCOS implementation. The only remaining work is to add the basis module branch in ncnn.
@blueardour I'd like to finish the BlendMask part with your help (if you don't have time to do it). Do you have Slack or Telegram to talk on? WeChat is also OK.
Please leave a message at my email: blueardour@gmail.com
@blueardour Hi, still about the BlendMask problem. I looked around and found that only this method has both high accuracy and high speed. Do you have a plan to export the ONNX model with as much of the postprocessing included as possible?
Sorry, I currently have no time to develop the postprocessing code. You might contact me by email if you are determined to implement the whole network. By the way, if speed is a concern, my new repo on model quantization may be helpful: https://github.com/blueardour/model-quantization Note that the detection/segmentation-related files are still in preparation.
@blueardour I tried contacting you by email but got no response. Do you have a more instant contact, such as Telegram?
@jinfagang Hi, I checked my mailbox and am sure I replied to you. Since your original email had no subject, it might have been filtered by Gmail. Anyway, I have edited the subject of the mail and replied to you again. Also, feel free to contact me via the WeChat account left in the email.
Hi. Does BlendMask support export to onnx now? |
Try the following steps (path/filename should be revised based on your own machine):
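The original steps were not preserved in this thread. As a rough pseudocode sketch of the workflow discussed above (script names are taken from other comments in this thread; the config and weight paths are placeholders, not actual file names):

```
# clone the fork mentioned earlier in this thread
git clone https://github.com/blueardour/uofa-AdelaiDet && cd uofa-AdelaiDet
# export to ONNX (config/weights paths must be filled in for your machine)
python onnx/export_model_to_onnx.py --config-file <your-config>.yaml \
    MODEL.WEIGHTS <your-weights>.pth
# verify the exported model against PyTorch outputs
python onnx/test_onnxruntime.py
```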
|
Hello, I have converted BlendMask into an ONNX model according to the code you provided. What is the meaning of the outputs of the converted ONNX model?
@linhaoqi027 Hi, could you please provide your successfully converted ONNX model? It would be very helpful. I got stuck converting any .pth model to ONNX; ArrayRef errors pop up.
Hello, how do you run the script (export_model_to_onnx.py)? When I run it to export BlendMask, I get an error:
Hi, have you solved it? I have the same problem. |
You can use another model that doesn't contain "_ModulatedDeformConv", for example: blendmask_550_r_50_3x.pth.
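A quick way to check whether a checkpoint contains such layers before attempting export is to scan its state-dict keys. The key-name patterns below are assumptions based on common detectron2-style naming, and the toy state dicts stand in for real checkpoints (which would be loaded with torch.load):

```python
def find_deform_keys(state_dict):
    """Return parameter names that look like deformable-conv weights.

    ModulatedDeformConv has no standard ONNX op, so a checkpoint containing
    such keys cannot be exported directly.
    """
    return [k for k in state_dict
            if "deform" in k.lower() or "dcn" in k.lower()]

# Toy state dicts standing in for real checkpoints (keys are illustrative):
dcn_ckpt = {"backbone.res5.conv2.deform_conv.weight": 0,
            "backbone.res5.conv2.weight": 0}
plain_ckpt = {"backbone.res5.conv2.weight": 0}

print(find_deform_keys(dcn_ckpt))    # -> ['backbone.res5.conv2.deform_conv.weight']
print(find_deform_keys(plain_ckpt))  # -> []
```

If the first list is non-empty, pick a checkpoint trained without deformable convolutions, as suggested above.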