Hi, I tried to use a pre-allocated buffer via `set_output_tensor` of `InferRequest`, but it does not seem to be allowed: `Can't SetBlob with name: Clamp_1431, because model output and blob are incompatible.`
The IR model I used has dynamic inputs and outputs. I converted it from ONNX with the `mo` tool and load it in C++; when I try to set a pre-allocated buffer for the output tensor at inference time, it fails with the error above.
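Roughly what I am attempting (a minimal sketch; the model path, device, input shape, and output size here are placeholders, not my actual values):

```cpp
#include <openvino/openvino.hpp>
#include <vector>

int main() {
    ov::Core core;
    // IR converted from ONNX with mo; it has dynamic input and output shapes.
    auto model = core.read_model("model.xml");
    auto compiled = core.compile_model(model, "CPU");
    ov::InferRequest request = compiled.create_infer_request();

    // Give the input a concrete shape for this call (placeholder shape).
    ov::Tensor input(ov::element::f32, ov::Shape{1, 3, 224, 224});
    request.set_input_tensor(input);

    // Pre-allocated host buffer I want the output written into (placeholder size).
    std::vector<float> out_buf(1000);
    ov::Tensor output(ov::element::f32, ov::Shape{1, 1000}, out_buf.data());
    request.set_output_tensor(output);  // fails here with:
    // "Can't SetBlob with name: Clamp_1431, because model output and blob are incompatible"

    request.infer();
    return 0;
}
```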
So what's the problem?
Thanks