ONNX Runtime and PyTorch results are different #20219

@W-QY

Description

Describe the issue

I designed and trained a 6D pose estimation model in PyTorch, then used torch.onnx.export to convert the .pth checkpoint into an ONNX inference model. Comparing the two, I found that for some inputs (for example, when the target in the image is small and the background is pure black), inference results from ONNX Runtime and PyTorch are clearly inconsistent, with a large gap between them; in these cases, both results also deviate substantially from the ground truth.

I want to know how to reduce, or completely avoid, these inference differences between ONNX Runtime and PyTorch.
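A common first step when debugging this kind of divergence is to feed the identical input to both runtimes and compare the outputs element-wise with explicit tolerances, since FP32 CUDA kernels are generally not bit-identical across frameworks. Below is a minimal, hedged sketch of such a comparison helper using NumPy only; the two arrays at the bottom merely stand in for the PyTorch and ONNX Runtime outputs, and the tolerance values are illustrative defaults, not recommendations.

```python
import numpy as np

def compare_outputs(torch_out, ort_out, rtol=1e-3, atol=1e-5):
    """Compare two model outputs; return (within_tolerance, max_abs, max_rel)."""
    torch_out = np.asarray(torch_out, dtype=np.float64)
    ort_out = np.asarray(ort_out, dtype=np.float64)
    abs_diff = np.abs(torch_out - ort_out)
    # Small epsilon avoids division by zero for exact-zero reference values.
    rel_diff = abs_diff / (np.abs(torch_out) + 1e-12)
    ok = np.allclose(torch_out, ort_out, rtol=rtol, atol=atol)
    return ok, abs_diff.max(), rel_diff.max()

# Stand-ins for the two frameworks' outputs on the same input.
pytorch_result = np.array([0.1, 0.5, 0.9])
ort_result = pytorch_result + 1e-6  # tiny numerical drift

ok, max_abs, max_rel = compare_outputs(pytorch_result, ort_result)
print(ok, max_abs, max_rel)
```

Large per-element differences on specific inputs (rather than uniform small drift) usually point at an operator-level mismatch, which can then be localized by comparing intermediate activations.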

To reproduce

Our algorithm can be found at this link: https://github.com/YangHai-1218/PseudoFlow/blob/69e8e7ad11a2a58f06532cc5b89b76300d83613b/models/estimator/wdr_pose.py

Urgency

No response

Platform

Linux

OS Version

86~20.04.2-Ubuntu

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.15.1

ONNX Runtime API

Python

Architecture

X64

Execution Provider

CUDA

Execution Provider Library Version

CUDA 11.3

Labels

ep:CUDA (issues related to the CUDA execution provider)
stale (issues that have not been addressed in a while; categorized by a bot)
