I tried this code:
```python
import torch
import torchvision
import onnx

model = torchvision.models.resnet18()
dummy_input = torch.randn(1, 3, 300, 300)
torch.onnx.export(model, dummy_input, './play.onnx', opset_version=11)
model = onnx.load('./play.onnx')
onnx.checker.check_model(model)
```
My onnx version is 1.6, installed with `python -m pip install onnx`, and my pytorch version is 1.3, installed with `conda install -c pytorch pytorch`.
Please tell me why there is a core dump here, and how I can make it work.
Is there any advice? I ran into the same issue, even though I was just following the tutorial: https://pytorch.org/tutorials/advanced/super_resolution_with_onnxruntime.html
I did not look closely into the cause of this error, but I solved it by importing onnx before importing torch:
```python
import onnx
import torch
```