I wanted to reduce the size of the RTMPose-tiny model through NCNN quantization.
Converting to NCNN without quantization works fine, but when quantization is enabled I get the error below.
Is quantization for RTMPose not supported yet?
Error

```
loading annotations into memory...
Done (t=0.23s)
creating index...
index created!
2023-04-11:14:10:34 - root - ERROR - 0
Traceback (most recent call last):
  File "/mmdeploy/mmdeploy/utils/utils.py", line 41, in target_wrapper
    result = target(*args, **kwargs)
  File "/mmdeploy/tools/onnx2ncnn_quant_table.py", line 47, in get_table
    input_tensor = input_data[0]
KeyError: 0
04/11 14:10:34 - mmengine - ERROR - tools/deploy.py - create_process - 82 - ncnn quant table failed.
```
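For what it's worth, a `KeyError: 0` on `input_data[0]` usually means `input_data` is a dict keyed by input tensor name rather than a sequence indexable by position. This is only an assumption based on the traceback, not a reading of the mmdeploy internals, but the failure mode can be sketched in isolation (the dict contents here are made-up stand-ins):

```python
# Stand-in for a preprocessed sample: a dict keyed by input name,
# not a list, so integer indexing fails.
input_data = {"input": [[0.1, 0.2, 0.3]]}

try:
    input_tensor = input_data[0]  # mirrors the failing line in get_table
except KeyError as e:
    print(f"KeyError: {e}")  # KeyError: 0

# Defensive alternative: take the first value regardless of key type.
input_tensor = next(iter(input_data.values()))
print(input_tensor)
```

If this guess is right, the fix on the mmdeploy side would be to look the tensor up by its input name (or take the first value) instead of assuming index `0`.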
And I used this command for deployment:

```shell
python tools/deploy.py configs/mmpose/pose-detection_simcc_ncnn-fp16_static-256x192.py ../mmpose/projects/rtmpose/rtmpose/body_2d_keypoint/rtmpose-t_8xb256-420e_coco-256x192.py ../mmpose/rtmpose-tiny_simcc-coco_pt-aic-coco_420e-256x192.pth demo/resources/human-pose.jpg --work-dir test3 --device cpu --quant
```