Issue with Barracuda inference results using tf2onnx converted BlazePose model. #91
Comments
Stepping through the outputs of each node in Barracuda and ONNX Runtime, the results appear to diverge at the padding layer (layer 17). Barracuda also displays the warning "only spatial padding is supported". I'm not sure how to interpret the padding attributes; could the padding be for non-spatial dimensions, which Barracuda doesn't support? If so, is support for pad layers with non-spatial padding on the road map for future features?
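For anyone unsure what "non-spatial" means here, this is a rough illustration in numpy (assuming the usual NCHW layout of ONNX tensors, which may or may not match this model). The ONNX `pads` attribute lists begin values for every axis followed by end values, so non-zero entries in the batch or channel positions pad a non-spatial dimension:

```python
import numpy as np

# ONNX Pad 'pads' format: [d1_begin, ..., dn_begin, d1_end, ..., dn_end].
# For an NCHW tensor, indices 1 and 5 pad the channel axis; non-zero
# values there make the padding "non-spatial".
pads = [0, 1, 0, 0, 0, 1, 0, 0]  # one channel of zeros before and after

x = np.zeros((1, 3, 4, 4), dtype=np.float32)
rank = x.ndim
pad_width = [(pads[i], pads[i + rank]) for i in range(rank)]
y = np.pad(x, pad_width)
print(y.shape)  # (1, 5, 4, 4) -- channel count grew from 3 to 5
```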
Hi @InternetSalmon! I checked your model. You were correct with your debugging: errors start appearing at the padding layer. The padding attributes are interpreted as follows: As of yet, we do not support this operation. It doesn't seem too hard to do, so we will add it to the list.
Thanks @AlexRibard, appreciate it! I had a look at Pad.compute; unfortunately I don't know much HLSL, but would the aim be to have a CHW version of
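Not the actual Barracuda kernel, but here is a numpy sketch of what a CHW-aware constant pad would compute over the channel axis (`pad_channels_chw` is a hypothetical helper name, not Barracuda API):

```python
import numpy as np

# Sketch: constant-border padding along the channel axis of a CHW tensor.
# Output channels outside [before, before + c) take the pad value;
# the rest are copied straight from the input.
def pad_channels_chw(x, before, after, value=0.0):
    c, h, w = x.shape
    out = np.full((c + before + after, h, w), value, dtype=x.dtype)
    out[before:before + c] = x
    return out

x = np.arange(2 * 2 * 2, dtype=np.float32).reshape(2, 2, 2)
y = pad_channels_chw(x, 1, 1)
print(y.shape)  # (4, 2, 2)
```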
I unfortunately can't give you a precise estimate of when we will implement it.
Thanks Alex, I'll keep watch of the release logs and give the
Hi @InternetSalmon
Thanks @AlexRibard, that is exciting! Appreciate all the work the team has been doing and looking forward to testing out 1.2.0!
Is the onnx-runtime error that it would not load, or different inference results? I had uploaded the opset 13 converted model, and that may have been why; I have attached the opset 9 model. I've been working with onnxruntime v1.1.0.
Ah, indeed, we didn't implement non-spatial padding for 1.2.0. Let me spend some time getting non-spatial padding working for you. It won't make it into 1.2.0, but I can send you the CS.
Hey Alex, if that is possible it would be amazing and highly appreciated!
I was able to get padding working with the correct inference result, although the changes are a bit of a hack... Looking forward to an official implementation to see how it is done properly! I was unsure how to approach TensorExtensions.ApplyBorder(); it seems to iterate
Hi @AlexRibard - have there been any updates on support for non-spatial padding? I believe I'm seeing the same issue as the original poster:
Thanks!
@JoeProgram can you share the model please?
@JoeProgram I was able to produce correct output from the model in my project after making some changes to the padding compute shader. An official fix would be greatly appreciated.
What changes are actually needed?
Commenting on this thread to say that official support for non-spatial padding has been added.
Hi,
I have a model I've converted from TensorFlow to ONNX with tf2onnx. Using TF and ONNX Runtime, the inference is the same, with matching output on the same test data.
TensorFlow model
ONNX model
Using the same ONNX model and test data with Barracuda, I get drastically different inference results.
I believe I'm creating the Barracuda input tensor correctly, and there is an error occurring during inference.
I've attached my Unity Barracuda test project, as well as Jupyter notebook and tensorflow + onnx models.
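One thing worth ruling out when comparing the two runtimes (a guess, not a confirmed diagnosis): Barracuda lays tensors out channels-last (NHWC), while ONNX models expect channels-first (NCHW), so test data prepared for onnxruntime may need a transpose before being fed to a Barracuda tensor built by hand. A numpy sketch:

```python
import numpy as np

# Test input prepared for onnxruntime, channels-first (NCHW).
x_nchw = np.arange(1 * 3 * 2 * 2, dtype=np.float32).reshape(1, 3, 2, 2)

# Reorder to channels-last (NHWC) before copying into a Barracuda tensor.
x_nhwc = np.transpose(x_nchw, (0, 2, 3, 1))
print(x_nhwc.shape)  # (1, 2, 2, 3)
```

Feeding NCHW-ordered floats straight into an NHWC buffer scrambles every channel, which produces exactly the kind of "drastically different" outputs described above.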
Unity:
InferenceTest.zip
Jupyter + Models:
Jupyter test and models
Any insight into the different output when using Barracuda would be greatly appreciated!