
Is TSN supported? #10

Closed
wwdok opened this issue Dec 2, 2020 · 6 comments

Comments

@wwdok

wwdok commented Dec 2, 2020

TSN means Temporal Segment Networks, a model used in action recognition. This is the model I want to convert. Does this tool support this conversion, or does it support any other action recognition models?

@hxcai
Collaborator

hxcai commented Dec 2, 2020

@wwdok We haven't tried it, but you can give it a try with our tool.

@wwdok
Author

wwdok commented Dec 3, 2020

I have just tried it out, but it reports an error:

(base) weidawang@weidawang-TUF-Gaming-FX506LU-FX506LU:~/Repo/mmaction2/tools$ python onnx2tensorrt.py
In node -1 (importIf): UNSUPPORTED_NODE: Assertion failed: cond.is_weights() && cond.weights().count() == 1 && "If condition must be a initializer!"
[TensorRT] ERROR: Network must have at least one output
[TensorRT] ERROR: Network validation failed.
Traceback (most recent call last):
  File "onnx2tensorrt.py", line 10, in <module>
    trt_model = onnx2trt(model)
  File "/home/weidawang/miniconda3/lib/python3.7/site-packages/volksdep/converters/onnx2trt.py", line 119, in onnx2trt
    trt_model = TRTModel(engine)
  File "/home/weidawang/miniconda3/lib/python3.7/site-packages/volksdep/converters/base.py", line 50, in __init__
    self.context = self.engine.create_execution_context()
AttributeError: 'NoneType' object has no attribute 'create_execution_context'
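For reference, one way to see which op the parser is rejecting before the build fails is to list the graph's op types and diff them against what the converter supports. The helper below is a minimal pure-Python sketch; the `supported` set here is a hypothetical example, not TensorRT's real operator list. With a loaded model, the op types can be collected via `[n.op_type for n in onnx.load(path).graph.node]`.

```python
def find_unsupported(op_types, supported):
    """Return the sorted list of ONNX op types not in the converter's supported set."""
    return sorted(set(op_types) - set(supported))

# Hypothetical example: op types as they might appear in an exported TSN graph.
graph_ops = ["Conv", "Relu", "If", "Gather", "Reshape", "Gemm"]
supported = {"Conv", "Relu", "Gather", "Reshape", "Gemm"}

print(find_unsupported(graph_ops, supported))  # ['If'] -> the node the parser rejects
```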

@hxcai
Collaborator

hxcai commented Dec 3, 2020

@wwdok There is an unsupported operation in the onnx model. Maybe you can try using a higher opset_version in torch2onnx to convert the pytorch model to onnx, and then use onnx2trt to convert the onnx model to a tensorrt engine.
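The suggested two-step pipeline can be sketched as below. The volksdep function names come from the traceback above, but the exact signatures are an assumption and may differ between versions, so treat this as a sketch rather than the definitive API.

```python
def export_with_higher_opset(model, dummy_input, onnx_path="tsn.onnx", opset=11):
    """Sketch: PyTorch -> ONNX with a higher opset_version, then ONNX -> TensorRT.

    Assumes volksdep's torch2onnx / onnx2trt converters (names seen in the
    traceback above); check your installed version for the exact arguments.
    """
    from volksdep.converters import torch2onnx, onnx2trt

    # Export with a higher opset so ops like If may be expressed differently.
    torch2onnx(model, dummy_input, onnx_path, opset_version=opset)
    # Parse the onnx file and build a TensorRT engine.
    return onnx2trt(onnx_path)
```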

@wwdok
Author

wwdok commented Dec 4, 2020

Yes, the unsupported operation in the onnx model is If. I found a solution here, and solved it by using torch 1.5 to export the onnx model; now the tensorrt model is also exported successfully. In the meantime, I converted the same onnx model with trtexec, and I found that the .engine file sizes exported by volksdep (101.6 MB) and trtexec (122.6 MB) are different, so I am curious what the differences inside are. Could you please tell me?

@hxcai
Collaborator

hxcai commented Dec 4, 2020

@wwdok Please make sure that the two engine files are generated in the same environment and with the same parameters, such as platform, OS, CUDA version, TensorRT version, fp mode, workspace size, etc. Also, trtexec is implemented with the C++ API while ours is implemented with the Python API; this may make a difference.
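For an apples-to-apples size comparison, the build settings have to match on both sides. As a sketch, trtexec exposes the relevant knobs as flags (flag names can vary slightly across TensorRT versions, so check `trtexec --help` on your install):

```shell
# Build with explicit settings so both engines use the same precision and workspace.
# --workspace is in MiB; add --fp16 only if the volksdep build also used fp16,
# otherwise leave it out to stay in fp32 like volksdep's default.
trtexec --onnx=tsn.onnx \
        --saveEngine=tsn_trtexec.engine \
        --workspace=1024
```

On the volksdep side, pass the matching fp mode and max workspace size to onnx2trt (see its docstring for the exact parameter names), then compare the serialized engine sizes again.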

@wwdok
Author

wwdok commented Dec 4, 2020

Yes, they were converted in the same environment. For the fp mode I didn't set anything, I just used the default (for volksdep that is fp32). Thanks for your reply; I think I need more time to get familiar with TensorRT ~

@wwdok wwdok closed this as completed Dec 4, 2020