
Under version 2.7.1, Android inference output becomes NaN; the same model works correctly under version 2.5.0 #2628

Closed
yangy996 opened this issue Oct 24, 2023 · 10 comments

@yangy996

[screenshot attachment]
src.zip

@yangy996
Author

This model works correctly with MNN 2.7.1 in Python; only on Android does it produce the NaN error.

@v0jiuqi
Collaborator

v0jiuqi commented Oct 27, 2023

Please upload the original ONNX model so we can take a look.

@yangy996
Author

> Please upload the original ONNX model so we can take a look.

det.onnx

@v0jiuqi
Collaborator

v0jiuqi commented Oct 27, 2023

> Please upload the original ONNX model so we can take a look.
>
> det.onnx

Can't download it.

@yangy996
Author

> Please upload the original ONNX model so we can take a look.

det.zip

@v0jiuqi
Collaborator

v0jiuqi commented Dec 15, 2023

Running this model at fp32 precision on both versions gives identical results, with no NaN. Are you running it at fp16 precision?

@yangy996
Author

> Running this model at fp32 precision on both versions gives identical results, with no NaN. Are you running it at fp16 precision?

[screenshot attachment]
With backendConfig.precision = MNN::BackendConfig::Precision_Low;, recognition produces NaN.
With backendConfig.precision = MNN::BackendConfig::Precision_Low_BF16;, recognition works correctly.

@yangy996
Author

> Running this model at fp32 precision on both versions gives identical results, with no NaN. Are you running it at fp16 precision?

A further question: with backendConfig.power = MNN::BackendConfig::Power_High;, CPU usage is roughly 30% higher than with MNN::BackendConfig::Power_Low;.

@v0jiuqi
Collaborator

v0jiuqi commented Dec 26, 2023

> Running this model at fp32 precision on both versions gives identical results, with no NaN. Are you running it at fp16 precision?
>
> [screenshot attachment] With backendConfig.precision = MNN::BackendConfig::Precision_Low;, recognition produces NaN. With backendConfig.precision = MNN::BackendConfig::Precision_Low_BF16;, recognition works correctly.

Yes. Precision_Low is fp16.
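For readers hitting the same NaN, the settings discussed in this thread are all chosen through MNN's BackendConfig at session creation. A minimal configuration sketch (not the reporter's actual code; "det.mnn" and the thread count are placeholders):

```cpp
#include <MNN/Interpreter.hpp>
#include <memory>

// Sketch of a CPU session setup that sidesteps the fp16 NaN reported here
// by choosing BF16. "det.mnn" is a placeholder model path.
auto interpreter = std::shared_ptr<MNN::Interpreter>(
    MNN::Interpreter::createFromFile("det.mnn"));

MNN::ScheduleConfig config;
config.type = MNN_FORWARD_CPU;
config.numThread = 4;

MNN::BackendConfig backendConfig;
// Precision_Low is fp16 and produced NaN in this issue;
// Precision_Low_BF16 reportedly did not.
backendConfig.precision = MNN::BackendConfig::Precision_Low_BF16;
// Power_High was observed to raise CPU usage by roughly 30% over Power_Low.
backendConfig.power = MNN::BackendConfig::Power_Normal;
config.backendConfig = &backendConfig;

auto session = interpreter->createSession(config);
```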


Marking as stale. No activity in 60 days.
