Describe the feature request
Having ONNX Runtime accept torch.Tensor inputs (in addition to the current numpy arrays) would be useful in scenarios where numpy does not support the data type used in the original torch model, such as torch.bfloat16.
Describe scenario use case
Today, we are forced to transform the ONNX graph to convert bfloat16 into float16 because of numpy's lack of support for bfloat16.
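A minimal sketch of the gap described above, using only numpy (the float16 workaround changes the numeric format, since bfloat16 keeps float32's 8 exponent bits while float16 has only 5):

```python
import numpy as np

# numpy exposes no native bfloat16 dtype, so a bfloat16 tensor
# cannot be materialized as a numpy array directly:
assert not hasattr(np, "bfloat16")

# The workaround casts to float16 instead, which numpy does support,
# at the cost of a much smaller dynamic range (float16 max is 65504):
x = np.array([1.0, 65504.0], dtype=np.float16)
print(x.dtype)  # float16
```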
Supporting torch.Tensor directly would also bring ORT closer to PyTorch's original model, without numpy as a middleman.