Platform: x86 Linux. Version: MNN-2.8.1.
Build: the converter tools were built from source with default options; pymnn was also built from source.
Problem: in Python, running inference through the nn.load_module_from_file interface on a model that contains if/else control flow crashes the inference engine. Does MNN support models with control flow?
Problem analysis:
I simplified the PyTorch forward code. Without the if/else, the torch --> onnx --> mnn conversion works, and torch, onnx, and mnn inference all produce results. With the if/else, torch and onnx inference are fine, but mnn crashes. Comparing the two mnn models with GetMNNInfo shows a difference in their inputs.
Inputs of the no-if mnn model: [xs, right_context, is_last, decode_chunk_size]
Inputs of the with-if mnn model: [xs, right_context, is_last, /Concat_output_0, decode_chunk_size]
Guess: the extra /Concat_output_0 input in the with-if model is what causes the crash. If that guess is correct, how can this conversion problem be fixed?
Code: the code below is simplified from the project for easy testing, so some of it is deliberately redundant:
def forward(self, xs: torch.Tensor, decode_chunk_size: torch.Tensor,
            right_context: torch.Tensor, is_last: torch.Tensor) -> torch.Tensor:
    xs1 = xs
    right_context = right_context.squeeze(0)
    is_last = is_last.squeeze(0)
    decode_chunk_size = decode_chunk_size.squeeze(0)
    tmp = right_context + is_last
    xs2 = xs1[:, :tmp]
    xs3 = xs1[:, tmp:]
    xs3 = torch.cat([xs2, xs3], dim=1)
    if is_last > 0:
        ys = xs3[:, :decode_chunk_size]
    else:
        ys = xs3[:, :decode_chunk_size]
    return ys
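Note that both branches of the if/else assign the same value, so a possible workaround while waiting for a converter fix is to drop the branch entirely before export. A minimal sketch (plain Python lists standing in for the tensors, purely for illustration, not the MNN runtime) showing that the two forms are equivalent:

```python
# Pure-Python emulation of the slicing in forward() above; lists replace
# 1-D tensor rows, which is an assumption made only for illustration.
def forward_with_if(xs, decode_chunk_size, right_context, is_last):
    tmp = right_context + is_last
    xs3 = xs[:tmp] + xs[tmp:]          # mirrors torch.cat([xs2, xs3])
    if is_last > 0:
        ys = xs3[:decode_chunk_size]   # both branches are identical...
    else:
        ys = xs3[:decode_chunk_size]
    return ys

def forward_no_if(xs, decode_chunk_size, right_context, is_last):
    tmp = right_context + is_last
    xs3 = xs[:tmp] + xs[tmp:]
    return xs3[:decode_chunk_size]     # ...so the If node can be dropped

row = list(range(10))
assert forward_with_if(row, 4, 2, 1) == forward_no_if(row, 4, 2, 1)
```

With the branch removed, the exported ONNX graph contains no If node at all, which sidesteps the /Concat_output_0 input problem described above.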
ONNX model: onnx_model.zip
Running testMNNFromOnnx.py results in a core dump. Stepping through with MNNConvert -f ONNX ... reports: /Concat_output_0 is input but not found.
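The input difference reported by GetMNNInfo can be checked mechanically. A minimal sketch (plain Python, with the input names copied from the model info quoted in the issue description) that flags inputs one model declares but the other does not:

```python
def extra_inputs(reference, candidate):
    """Names that `candidate` declares as inputs but `reference` does not."""
    return [name for name in candidate if name not in set(reference)]

# Input names copied from the GetMNNInfo output in the issue description.
no_if = ["xs", "right_context", "is_last", "decode_chunk_size"]
had_if = ["xs", "right_context", "is_last", "/Concat_output_0", "decode_chunk_size"]

print(extra_inputs(no_if, had_if))  # prints ['/Concat_output_0']
```

The stray name it reports is exactly the one MNNConvert complains is "input but not found".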
Reproduced; tracking down the cause.
Fixed. Please wait for the change to sync, or apply the following patch: onnx.patch
Thanks. After applying the fix, inference works correctly.