
Model exported following the tutorial fails to load #81

Closed
aixier opened this issue Dec 9, 2019 · 6 comments

Comments

@aixier

aixier commented Dec 9, 2019

I exported a model following https://github.com/PaddlePaddle/PaddleDetection/blob/release/0.1/docs/EXPORT_MODEL.md and tried to load it with:

[inference_program, feed_target_names, fetch_targets] = fluid.io.load_inference_model(dirname=save_freeze_dir, executor=exe)

Error Message Summary:

PaddleCheckError: Cannot open file G:\NewMj_project\frlib\cascade_rcnn_dcn_r50_fpn_1x\res4d_branch2c_weights for load op
at [D:\1.6.1\paddle\paddle/fluid/operators/load_op.h:37]
[operator < load > error]
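
For context, the surrounding setup is roughly the following sketch (the place and executor are placeholders; the directory is the one from the error above):

import paddle.fluid as fluid

# Placeholder setup around the failing call.
place = fluid.CPUPlace()  # or fluid.CUDAPlace(0) for GPU
exe = fluid.Executor(place)

save_freeze_dir = r"G:\NewMj_project\frlib\cascade_rcnn_dcn_r50_fpn_1x"

[inference_program, feed_target_names, fetch_targets] = fluid.io.load_inference_model(
    dirname=save_freeze_dir, executor=exe)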

@qingqing01
Collaborator

qingqing01 commented Dec 9, 2019

@aixier Please share the command you ran when exporting the model, and make sure the weights to load are set correctly via the weights option.

Also, in

[inference_program, feed_target_names, fetch_targets] = fluid.io.load_inference_model(dirname=save_freeze_dir, executor=exe)

it is best to also specify the model_filename and params_filename arguments.
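
For example, a minimal sketch (assuming the exported files are named "model" and "params"; adjust to whatever export_model.py actually wrote):

[inference_program, feed_target_names, fetch_targets] = fluid.io.load_inference_model(
    dirname=save_freeze_dir, executor=exe,
    model_filename="model", params_filename="params")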

qingqing01 changed the title from "Model exported following the tutorial fails to load. https://github.com/PaddlePaddle/PaddleDetection/blob/release/0.1/docs/EXPORT_MODEL.md" to "Model exported following the tutorial fails to load" on Dec 9, 2019
@aixier
Author

aixier commented Dec 9, 2019

python tools/export_model.py -c configs/dcn/cascade_rcnn_dcn_r50_fpn_1x.yml --output_dir=./inference_model_cas -o weights=/home/aistudio/work/PaddleDetection/ResNet50_cos_pretrained.tar YoloTestFeed.image_shape=[3,448,448] @qingqing01

@qingqing01
Collaborator

@aixier The wrong saved model was selected. Note that the model corresponding to cascade_rcnn_dcn_r50_fpn_1x is:
https://paddlemodels.bj.bcebos.com/object_detection/cascade_rcnn_dcn_r50_fpn_1x.tar

Also, in

[inference_program, feed_target_names, fetch_targets] = fluid.io.load_inference_model(dirname=save_freeze_dir, executor=exe)

it is best to specify the model_filename and params_filename arguments.

@aixier
Author

aixier commented Dec 9, 2019

It loads now with:

[inference_program, feed_target_names, fetch_targets] = fluid.io.load_inference_model(dirname=path, executor=exe, model_filename="model", params_filename="params")

But after setting im_info = np.array([800., 800., 1.]) and then running

batch_outputs = exe.run(inference_program,
                        feed={feed_target_names[0]: tensor_img,
                              feed_target_names[1]: im_info,
                              feed_target_names[2]: image_shape[np.newaxis, :]},
                        fetch_list=fetch_targets,
                        return_numpy=False)

it raises:

PaddleCheckError: Tensor holds the wrong type, it holds double, but desires to be float at [D:\1.6.1\paddle\paddle/fluid/framework/tensor_impl.h:30]
@qingqing01

@aixier
Author

aixier commented Dec 9, 2019

An explicit cast fixes it: im_info = np.array([800., 800., 1.], dtype='float32')
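
Putting it together, a minimal end-to-end sketch with every input cast to float32 (the shapes, values, and model directory here are placeholders, not the exact script):

import numpy as np
import paddle.fluid as fluid

place = fluid.CPUPlace()
exe = fluid.Executor(place)

# Placeholder path to the exported model directory.
path = "./inference_model_cas/cascade_rcnn_dcn_r50_fpn_1x"

[inference_program, feed_target_names, fetch_targets] = fluid.io.load_inference_model(
    dirname=path, executor=exe, model_filename="model", params_filename="params")

# numpy defaults to float64, but the model expects float32, so cast every input.
tensor_img = np.random.rand(1, 3, 800, 800).astype('float32')    # placeholder image batch
im_info = np.array([[800., 800., 1.]], dtype='float32')          # resized height, width, scale
image_shape = np.array([800., 800., 1.], dtype='float32')        # original height, width, scale

batch_outputs = exe.run(inference_program,
                        feed={feed_target_names[0]: tensor_img,
                              feed_target_names[1]: im_info,
                              feed_target_names[2]: image_shape[np.newaxis, :]},
                        fetch_list=fetch_targets,
                        return_numpy=False)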

@qingqing01
Collaborator

@aixier Since it is resolved, closing this.
