light api cannot run quantized model inference #5896
Comments
Hi! We've received your issue; please be patient while we respond. We will arrange for technicians to answer your questions as soon as possible. Please make sure you have posted enough information to demonstrate your request. You may also check the API docs, FAQ, and GitHub issues for an answer. Have a nice day!
[Device]: Snapdragon 845. Note: the FP16 model can be loaded and run normally.
Crash log:
A quantized model, ppyolo_tiny_x_coco.zip, was produced with PaddleSlim static-graph quantization-aware training, and two inference APIs were tested: with the `__models__` and `__params__` files, inference runs normally; with the `nb` file, inference fails.
Paddle-Lite version: 2.7
opt version: 2.7
OS: Android
Run inference using the light API
The quantization-trained model was converted to an `nb` file with the opt tool from Paddle-Lite 2.7, using the following command:
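(The original command was not captured in this extraction; a typical opt invocation, where the model/param file names, output path, and ARM target are assumptions, looks roughly like this:)

```shell
# Sketch of an opt invocation; paths and names below are placeholders.
./opt --model_file=./ppyolo_tiny_x_coco/__models__ \
      --param_file=./ppyolo_tiny_x_coco/__params__ \
      --optimize_out_type=naive_buffer \
      --optimize_out=./ppyolo_tiny_x_coco_opt \
      --valid_targets=arm
```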
The conversion produced the following warning:
Inference was run with the following code:
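(The original snippet is not reproduced in this extraction; a minimal light-API sketch, in which the `nb` file path and the input shape are assumptions, would look roughly like this:)

```cpp
#include <algorithm>
#include <memory>
#include "paddle_api.h"  // Paddle-Lite C++ API header

using namespace paddle::lite_api;

int main() {
  // Load the nb model produced by opt (path is a placeholder).
  MobileConfig config;
  config.set_model_from_file("./ppyolo_tiny_x_coco_opt.nb");

  // The light API builds the predictor directly from the optimized model.
  std::shared_ptr<PaddlePredictor> predictor =
      CreatePaddlePredictor<MobileConfig>(config);

  // Fill a dummy input (shape 1x3x320x320 is an assumption for ppyolo_tiny).
  std::unique_ptr<Tensor> input = predictor->GetInput(0);
  input->Resize({1, 3, 320, 320});
  float* data = input->mutable_data<float>();
  std::fill(data, data + 1 * 3 * 320 * 320, 0.f);

  predictor->Run();  // with the quantized nb model, this is where it fails
  return 0;
}
```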
The error message is as follows:
Run inference using the full API with CxxConfig
Loading the quantized model with the following code runs normally. The key setting is `cxx_config.set_valid_places`:
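(A sketch of the working full-API path, where the model/param file paths are placeholders; the `Place` list including `PRECISION(kInt8)` is the setting referred to above:)

```cpp
#include <memory>
#include "paddle_api.h"  // Paddle-Lite C++ API header

using namespace paddle::lite_api;

int main() {
  CxxConfig cxx_config;
  // Point at the combined model/param files (paths are placeholders).
  cxx_config.set_model_file("./ppyolo_tiny_x_coco/__models__");
  cxx_config.set_param_file("./ppyolo_tiny_x_coco/__params__");

  // The key setting: declare Int8 as a valid kernel precision on ARM,
  // alongside Float, so the quantized ops can be scheduled.
  cxx_config.set_valid_places({
      Place{TARGET(kARM), PRECISION(kInt8)},
      Place{TARGET(kARM), PRECISION(kFloat)},
  });

  std::shared_ptr<PaddlePredictor> predictor =
      CreatePaddlePredictor<CxxConfig>(cxx_config);
  // (Inputs would be filled here as in the light-API example.)
  predictor->Run();  // runs normally with the quantized model
  return 0;
}
```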