
When exporting from torch, the result of the input varies depending on the dummy input. #6031

Closed
parkkyungjun opened this issue Mar 24, 2024 · 2 comments
Labels
question Questions about ONNX

Comments

@parkkyungjun

parkkyungjun commented Mar 24, 2024

Ask a Question

Question

When exporting from torch, the model's output varies depending on which dummy input I use for the export. What's wrong with this?

Further information

  • Relevant Area:

torch.onnx.export(model, (dummy_input, th), model_path, verbose=True,
                  export_params=True,
                  opset_version=12,
                  do_constant_folding=True,
                  input_names=['input', 'threshold'],
                  output_names=['output1', 'output2'])

My model's forward pass uses the torch.cdist function.

I precompute a cdist result, store it on the model class's self, and compare that saved value against the exported model's output; the two differ.

I think cdist is the cause. The exported cdist is also very slow (it takes almost 6 seconds).
Help me
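torch.cdist has historically been a pain point for ONNX export. A common workaround (a sketch, not the fix the author describes below) is to compute Euclidean pairwise distances via the squared-norm expansion, which lowers to plain matmul/add ops that export and run efficiently:

```python
import torch

def pairwise_dist(a, b):
    # ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y, then clamp and sqrt.
    # Equivalent to torch.cdist(a, b, p=2) up to floating-point error.
    a2 = (a * a).sum(-1, keepdim=True)                     # (..., M, 1)
    b2 = (b * b).sum(-1, keepdim=True).transpose(-1, -2)   # (..., 1, N)
    sq = a2 + b2 - 2 * (a @ b.transpose(-1, -2))
    return sq.clamp_min(0).sqrt()
```

This matches torch.cdist numerically (to within rounding) and swaps in for it inside forward() before export.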

  • Is this issue related to a specific model?
    Model name:
    Model opset:

Notes

@parkkyungjun parkkyungjun added the question Questions about ONNX label Mar 24, 2024
@parkkyungjun parkkyungjun changed the title When exporting from torch, the result of the input varies depending on the dummy input. What's wrong with this? When exporting from torch, the result of the input varies depending on the dummy input. help me Mar 24, 2024
@parkkyungjun parkkyungjun changed the title When exporting from torch, the result of the input varies depending on the dummy input. help me When exporting from torch, the result of the input varies depending on the dummy input. Mar 24, 2024
@gramalingam
Contributor

I think it would be better to ask this question in the pytorch (to onnx converter) repo. @thiagocrepaldi

@parkkyungjun
Author

parkkyungjun commented Mar 27, 2024

I think it would be better to ask this question in the pytorch (to onnx converter) repo. @thiagocrepaldi

I solved this issue; the cause turned out to be a different problem.
Thank you
