Description
Thanks for your error report and we appreciate it a lot.
Checklist
- I have searched related issues but cannot get the expected help.
- The bug has not been fixed in the latest version.
Describe the bug
A clear and concise description of what the bug is.
When I run python tools/pytorch2onnx.py (with the other required arguments), I get a correct ONNX file as output. But when I change the batch size in input_shape from 1 to 8 (or any other number > 1), I get an error.
Reproduction
1. What command or script did you run?
python tools/pytorch2onnx.py (the other arguments worked when the input shape was 1,3,h,w)
2. Did you make any modifications on the code or config? Did you understand what you have modified?
if args.shape is None:
    img_scale = cfg.test_pipeline[1]['img_scale']
    input_shape = (8, 3, img_scale[1], img_scale[0])
elif len(args.shape) == 1:
    input_shape = (8, 3, args.shape[0], args.shape[0])
elif len(args.shape) == 2:
    input_shape = (
        8,
        3,
    ) + tuple(args.shape)
else:
    raise ValueError('invalid input shape')
I changed the input shape from (1, 3, h, w) to (8, 3, h, w). (See also the dynamic_axes note after this list.)
3. What dataset did you use?
It doesn't matter.
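As a side note (a sketch only, not code from the repo, and I have not verified it against this model): instead of hard-coding batch size 8 into input_shape, torch.onnx.export can mark the batch dimension as dynamic via dynamic_axes, so tracing still runs with batch 1 but the exported graph is not fixed to that batch size. The toy model and file names below are placeholders.

import torch

# Placeholder model standing in for the segmentor prepared by pytorch2onnx.py.
model = torch.nn.Conv2d(3, 19, kernel_size=3, padding=1)
dummy_input = torch.randn(1, 3, 512, 512)

torch.onnx.export(
    model,
    dummy_input,
    'model_dynamic_batch.onnx',
    opset_version=11,
    input_names=['input'],
    output_names=['output'],
    # dim 0 of input/output is exported as a dynamic 'batch' dimension
    dynamic_axes={'input': {0: 'batch'}, 'output': {0: 'batch'}})

Whether the resulting graph actually produces correct results for batch 8 still depends on the model itself.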
Environment
pytorch 1.6; mmcv 1.3.7; mmseg 0.14.0
- Please run python mmseg/utils/collect_env.py to collect necessary environment information and paste it here.
- You may add additional information that may be helpful for locating the problem, such as:
  - How you installed PyTorch [e.g., pip, conda, source]: pip
  - Other environment variables that may be related (such as $PATH, $LD_LIBRARY_PATH, $PYTHONPATH, etc.): no
Error traceback
Traceback (most recent call last):
File "tools/pytorch2onnx_old.py", line 212, in <module>
verify=args.verify)
File "tools/pytorch2onnx_old.py", line 120, in pytorch2onnx
opset_version=opset_version)
File "/workdir/lxj/seg_3.6/lib/python3.6/site-packages/torch/onnx/__init__.py", line 208, in export
custom_opsets, enable_onnx_checker, use_external_data_format)
File "/workdir/lxj/seg_3.6/lib/python3.6/site-packages/torch/onnx/utils.py", line 92, in export
use_external_data_format=use_external_data_format)
File "/workdir/lxj/seg_3.6/lib/python3.6/site-packages/torch/onnx/utils.py", line 530, in _export
fixed_batch_size=fixed_batch_size)
File "/workdir/lxj/seg_3.6/lib/python3.6/site-packages/torch/onnx/utils.py", line 366, in _model_to_graph
graph, torch_out = _trace_and_get_graph_from_model(model, args)
File "/workdir/lxj/seg_3.6/lib/python3.6/site-packages/torch/onnx/utils.py", line 319, in _trace_and_get_graph_from_model
torch.jit._get_trace_graph(model, args, strict=False, _force_outplace=False, _return_inputs_states=True)
File "/workdir/lxj/seg_3.6/lib/python3.6/site-packages/torch/jit/__init__.py", line 338, in _get_trace_graph
outs = ONNXTracedModule(f, strict, _force_outplace, return_inputs, _return_inputs_states)(*args, **kwargs)
File "/workdir/lxj/seg_3.6/lib/python3.6/site-packages/torch/nn/modules/module.py", line 722, in _call_impl
result = self.forward(*input, **kwargs)
File "/workdir/lxj/seg_3.6/lib/python3.6/site-packages/torch/jit/__init__.py", line 426, in forward
self._force_outplace,
File "/workdir/lxj/seg_3.6/lib/python3.6/site-packages/torch/jit/__init__.py", line 415, in wrapper
out_vars, _ = _flatten(outs)
RuntimeError: Only tuples, lists and Variables supported as JIT inputs/outputs. Dictionaries and strings are also accepted but their usage is not recommended. But got unsupported type numpy.ndarray
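For what it's worth, the final error can be reproduced outside mmseg with a toy module (this is only an illustration, not the actual mmseg code path): with PyTorch 1.6, tracing inside torch.onnx.export fails the same way whenever forward returns a numpy.ndarray.

import torch


class ReturnsNumpy(torch.nn.Module):
    # Toy module whose forward returns a numpy array instead of a tensor.
    def forward(self, x):
        return (x * 2).detach().cpu().numpy()


# The tracer cannot flatten the numpy output and raises:
# RuntimeError: Only tuples, lists and Variables supported as JIT inputs/outputs ...
torch.onnx.export(ReturnsNumpy(), torch.randn(8, 3, 64, 64), 'repro.onnx',
                  opset_version=11)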
Bug fix
If you have already identified the reason, you can provide the information here. If you are willing to create a PR to fix it, please also leave a comment here and that would be much appreciated!
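I have not confirmed the root cause, but the traceback suggests that with a batch size greater than 1 the patched forward ends up returning numpy.ndarray outputs, which torch.jit's flattening step rejects. One possible workaround, purely a sketch (the wrapper class and the assumption about numpy outputs are mine, not from the repo), is to wrap the model so that numpy outputs are converted back to tensors before export:

import numpy as np
import torch


class TensorOutputWrapper(torch.nn.Module):
    """Hypothetical wrapper: converts numpy outputs of the wrapped model
    back to torch.Tensor so the ONNX tracer can flatten them."""

    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, img):
        out = self.model(img)
        if isinstance(out, np.ndarray):
            return torch.from_numpy(out)
        if isinstance(out, (list, tuple)):
            return [torch.from_numpy(o) if isinstance(o, np.ndarray) else o
                    for o in out]
        return out


# usage sketch: export the wrapped model instead of the bare one, e.g.
# torch.onnx.export(TensorOutputWrapper(segmentor), dummy_input, 'model.onnx', opset_version=11)

Note that if the output really is a numpy array at that point, converting it back inside forward would make it a constant in the traced graph, so this only localizes the type problem; the proper fix is probably for the test/inference path to keep returning tensors when torch.onnx.is_in_onnx_export() is true, even for batch sizes greater than 1.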