"AliasWithName is not a registered function/op" when run converted onnx model #1868
Comments
As https://detectron2.readthedocs.io/modules/export.html#detectron2.export.Caffe2Tracer.export_onnx explains, this is working as expected.
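To see which ops in an export are outside the standard ONNX opset (and would therefore be rejected by a stock onnxruntime build), the graph's nodes can be scanned by domain. This is a minimal sketch, not from the issue: the `org.pytorch.caffe2` domain string and the example node list are assumptions about how the Caffe2 exporter tags its custom ops.

```python
# Sketch: flag ops that a standard onnxruntime build will not know about.
# Ops like AliasWithName live in a Caffe2-specific domain rather than the
# default ONNX domain, so InferenceSession creation fails on them.
def find_nonstandard_ops(nodes):
    """nodes: iterable of (domain, op_type) pairs; '' means the default ONNX domain."""
    return sorted({op for domain, op in nodes if domain not in ('', 'ai.onnx')})

# Example node list resembling a Caffe2-traced export (illustrative only):
nodes = [('', 'Conv'), ('org.pytorch.caffe2', 'AliasWithName'), ('', 'Relu')]
print(find_nonstandard_ops(nodes))  # ['AliasWithName']
```

With a real file you would load it via the `onnx` package and feed `(node.domain, node.op_type)` for each node in `model.graph.node` to this helper.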
Would you please give me some ideas on how to deal with it? I'm still unclear about that, especially how to run detectron2 on onnxruntime.
I saw that there are 3 options when exporting a model (caffe2, onnx, torchscript). Why is the exported model only usable in Caffe2? Why can the exported ONNX model not run on onnxruntime, but instead needs post-processing?
I am facing the same issue.
I used tools/deploy/caffe2_converter.py to convert to an ONNX model. But when I loaded this model with onnxruntime on Google Colab, it threw an exception:
[ONNXRuntimeError] : 1 : FAIL : Fatal error: AliasWithName is not a registered function/op
My code to load the ONNX model:

```python
import torch

def to_numpy(tensor):
    return tensor.detach().cpu().numpy() if tensor.requires_grad else tensor.cpu().numpy()

class ONNX_DETECT:
    ...  # class body omitted in the original report

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
detectron_model = ONNX_DETECT('/content/model.onnx', device.type)
```
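For comparison, running a model that uses only standard ONNX ops with onnxruntime would look roughly like this. A hedged sketch, not the reporter's code: the model path and input handling are placeholders, and session creation is exactly the step that fails with "AliasWithName is not a registered function/op" for a Caffe2-traced export.

```python
import numpy as np

def run_onnx(model_path, input_array):
    # Assumes onnxruntime is installed; imported inside the function so the
    # helper can be defined even where it is not.
    import onnxruntime as ort
    # InferenceSession creation is where a Caffe2-traced export fails,
    # because ops like AliasWithName are not registered in onnxruntime.
    sess = ort.InferenceSession(model_path, providers=['CPUExecutionProvider'])
    input_name = sess.get_inputs()[0].name
    # Run all outputs; a float32 input is assumed here for illustration.
    return sess.run(None, {input_name: input_array.astype(np.float32)})
```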
Has anyone faced this issue?