failed to convert SAFMN #2504
Are they telling us that SAFMN ONNX models only support fixed resolutions? If so, then SAFMN ONNX models are pretty much useless, since they would only accept input images of a single fixed resolution.
The author said we could use maxpool instead.
I tried that and it doesn't work. I just get a different error message:

```
  File "C:\Users\micha\Git\chaiNNer\backend\src\packages\chaiNNer_pytorch\pytorch\utility\convert_to_onnx.py", line 89, in convert_to_onnx_node
    onnx_model_bytes = convert_to_onnx_impl(
  File "C:\Users\micha\Git\chaiNNer\backend\src\nodes\impl\pytorch\convert_to_onnx_impl.py", line 56, in convert_to_onnx_impl
    torch.onnx.export(
  File "C:\Python38\lib\site-packages\torch\onnx\utils.py", line 516, in export
    _export(
  File "C:\Python38\lib\site-packages\torch\onnx\utils.py", line 1596, in _export
    graph, params_dict, torch_out = _model_to_graph(
  File "C:\Python38\lib\site-packages\torch\onnx\utils.py", line 1139, in _model_to_graph
    graph = _optimize_graph(
  File "C:\Python38\lib\site-packages\torch\onnx\utils.py", line 677, in _optimize_graph
    graph = _C._jit_pass_onnx(graph, operator_export_type)
  File "C:\Python38\lib\site-packages\torch\onnx\utils.py", line 1940, in _run_symbolic_function
    return symbolic_fn(graph_context, *inputs, **attrs)
  File "C:\Python38\lib\site-packages\torch\onnx\symbolic_helper.py", line 395, in wrapper
    return fn(g, *args, **kwargs)
  File "C:\Python38\lib\site-packages\torch\onnx\symbolic_helper.py", line 289, in wrapper
    args = [
  File "C:\Python38\lib\site-packages\torch\onnx\symbolic_helper.py", line 290, in <listcomp>
    _parse_arg(arg, arg_desc, arg_name, fn_name)  # type: ignore[assignment]
  File "C:\Python38\lib\site-packages\torch\onnx\symbolic_helper.py", line 107, in _parse_arg
    raise errors.SymbolicValueError(
torch.onnx.errors.SymbolicValueError: Failed to export a node '%143 : Long(requires_grad=0, device=cpu) = onnx::Cast[to=7](%142), scope: nodes.impl.pytorch.convert_to_onnx_impl.convert_to_onnx_impl.<locals>.FakeModel:: # C:\Python38\lib\site-packages\torch\_tensor.py:942:0
' (in list node %147 : int[] = prim::ListConstruct(%143, %146), scope: nodes.impl.pytorch.convert_to_onnx_impl.convert_to_onnx_impl.<locals>.FakeModel::
) because it is not constant. Please try to make things (e.g. kernel sizes) static if possible. [Caused by the value '147 defined in (%147 : int[] = prim::ListConstruct(%143, %146), scope: nodes.impl.pytorch.convert_to_onnx_impl.convert_to_onnx_impl.<locals>.FakeModel::
)' (type 'List[int]') in the TorchScript graph. The containing node has kind 'prim::ListConstruct'.]
    Inputs:
        #0: 143 defined in (%143 : Long(requires_grad=0, device=cpu) = onnx::Cast[to=7](%142), scope: nodes.impl.pytorch.convert_to_onnx_impl.convert_to_onnx_impl.<locals>.FakeModel:: # C:\Python38\lib\site-packages\torch\_tensor.py:942:0
    ) (type 'Tensor')
        #1: 146 defined in (%146 : Long(requires_grad=0, device=cpu) = onnx::Cast[to=7](%145), scope: nodes.impl.pytorch.convert_to_onnx_impl.convert_to_onnx_impl.<locals>.FakeModel:: # C:\Python38\lib\site-packages\torch\_tensor.py:942:0
    ) (type 'Tensor')
    Outputs:
        #0: 147 defined in (%147 : int[] = prim::ListConstruct(%143, %146), scope: nodes.impl.pytorch.convert_to_onnx_impl.convert_to_onnx_impl.<locals>.FakeModel::
    ) (type 'List[int]')
```

Here's what I changed:

```diff
@@ -126,7 +126,7 @@ class SAFM(nn.Module):
         for i in range(self.n_levels):
             if i > 0:
                 p_size = (h // 2**i, w // 2**i)
-                s = F.adaptive_max_pool2d(xc[i], p_size)
+                s = F.max_pool2d(xc[i], p_size)
                 s = self.mfr[i](s)
                 s = F.interpolate(s, size=(h, w), mode="nearest")
             else:
```
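For context on why both variants fail the same way: the exporter complains because `p_size` is derived from the input's shape at trace time, so it is a tensor, not a constant, and pooling sizes must be Python ints for ONNX export. One observation (a sketch of the idea, not a confirmed fix from the SAFMN author): when `h` and `w` are divisible by `2**i`, adaptive max pooling to `(h // 2**i, w // 2**i)` is exactly max pooling with a fixed kernel and stride of `2**i`, and `2**i` is a compile-time constant the exporter can handle. The helpers below are hypothetical pure-Python stand-ins used only to demonstrate that equivalence:

```python
import random

def max_pool2d_fixed(grid, k):
    """Max-pool a 2D list with kernel size k and stride k (no padding)."""
    h, w = len(grid), len(grid[0])
    return [
        [max(grid[r + dr][c + dc] for dr in range(k) for dc in range(k))
         for c in range(0, w, k)]
        for r in range(0, h, k)
    ]

def adaptive_max_pool2d(grid, out_hw):
    """Adaptive max pooling: partition the input into out_hw regions."""
    h, w = len(grid), len(grid[0])
    oh, ow = out_hw

    def bounds(size, n, i):
        # Region i of n over an axis of the given size.
        return (i * size) // n, ((i + 1) * size) // n

    out = []
    for r in range(oh):
        r0, r1 = bounds(h, oh, r)
        row = []
        for c in range(ow):
            c0, c1 = bounds(w, ow, c)
            row.append(max(grid[rr][cc]
                           for rr in range(r0, r1)
                           for cc in range(c0, c1)))
        out.append(row)
    return out

# When h and w divide evenly by 2**i, adaptive pooling to the dynamic
# output size (h // 2**i, w // 2**i) matches fixed-kernel pooling with
# the static kernel size 2**i.
random.seed(0)
h = w = 8
grid = [[random.random() for _ in range(w)] for _ in range(h)]
for i in (1, 2, 3):
    k = 2 ** i
    assert adaptive_max_pool2d(grid, (h // k, w // k)) == max_pool2d_fixed(grid, k)
```

In torch terms this would correspond to `F.max_pool2d(xc[i], kernel_size=2**i)` rather than passing the computed output size — note that `max_pool2d`'s second argument is a kernel size, not an output size, so the diff above still feeds it a shape-derived (hence non-constant) value. Whether the fixed-kernel form is what the SAFMN author meant by "use maxpool instead" is not confirmed here.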
Information:

Description: failed to convert SAFMN (pth → onnx)

Logs: main.log, renderer.log