
Error when running the detection model convolutional channel pruning example #30

Closed
liu36259069 opened this issue Nov 20, 2019 · 1 comment · Fixed by #33

Comments

@liu36259069

When I run the command

python compress.py \
    -s yolov3_mobilenet_v1_slim.yaml \
    -c ../../configs/yolov3_mobilenet_v1.yml \
    -o max_iters=20 \
       num_classes=4 \
       YoloTrainFeed.batch_size=32 \
       pretrain_weights=/home/aistudio/PaddleDetection/output/yolov3_mobilenet_v1/best_model \
    -d "/home/aistudio/work/coco"

it keeps failing with the error below. I traced it to eval_utils.py, where the gt_box values are converted into im_id, which raises the exception. The error output is:

loading annotations into memory...
Done (t=0.01s)
creating index...
index created!
[ 16. 115. 218. 374.]
Traceback (most recent call last):
  File "/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/paddle/fluid/contrib/slim/core/compressor.py", line 593, in run
    self._eval(context)
  File "/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/paddle/fluid/contrib/slim/core/compressor.py", line 542, in _eval
    func(self.eval_graph.program, self.scope))
  File "compress.py", line 207, in eval_func
    FLAGS.output_eval)
  File "../../ppdet/utils/eval_utils.py", line 205, in eval_results
    is_bbox_normalized=is_bbox_normalized)
  File "../../ppdet/utils/coco_eval.py", line 86, in bbox_eval
    results, clsid2catid, is_bbox_normalized=is_bbox_normalized)
  File "../../ppdet/utils/coco_eval.py", line 215, in bbox2out
    im_id = int(im_ids[i][0])
TypeError: only size-1 arrays can be converted to Python scalars
2019-11-20 20:32:00,491-ERROR: None
2019-11-20 20:32:00,491-ERROR: None
2019-11-20 20:32:01,633-INFO: epoch:1; batch_id:0; odict_keys(['loss', 'lr']) = [117.678, 0.0]
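For reference, here is a minimal sketch (not from the repo) of why the int() call raises: bbox2out() expects each im_ids[i] entry to hold a single image id, but here it receives a 4-element gt_box row like the one printed above.

import numpy as np

# Expected input: each entry carries a single image id, so int() succeeds.
im_ids = np.array([[1], [2]])
print(int(im_ids[0][0]))        # -> 1

# What actually arrives: gt_box values, where each entry is a 4-element box.
gt_boxes = np.array([[[16., 115., 218., 374.]]])
int(gt_boxes[0][0])             # TypeError: only size-1 arrays can be converted to Python scalars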

Further digging showed that compress.py passes the gt_box values in place of im_id, which causes the error.
Adding outs.append(data['im_id']) at line 79 fixes the problem.
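A hedged sketch of the fix described above, assuming data is the per-batch feed dict and outs is the list of evaluation outputs built in compress.py (only the appended im_id line comes from this report; the other lines are illustrative):

# Hypothetical minimal context (contents assumed, not verbatim compress.py code):
data = {'image': None, 'im_id': 0}   # per-batch feed dict from the data reader
outs = [data['image']]               # evaluation outputs collected for coco_eval

# The one-line fix reported in this issue (compress.py, around line 79):
outs.append(data['im_id'])           # make the image id available to bbox2out()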

(screenshot of the change attached)

@qingqing01
Collaborator

@liu36259069 Thanks for the report, we will fix it~

HCLAC pushed a commit to HCLAC/PaddleDetection that referenced this issue May 23, 2024
Release went live

Closes PaddlePaddle#34, PaddlePaddle#33, PaddlePaddle#32, and PaddlePaddle#30

See merge request frontend/fengyan-mp!5