
multiple inference problem #17

Closed
XiangjiBU opened this issue Dec 25, 2020 · 11 comments

XiangjiBU commented Dec 25, 2020

Hi, thanks for sharing!
I ran multiple inference on a Jetson Xavier (JetPack 4.4), but no results were detected. The terminal output is below.
I tested the two models individually, and each works well standalone.

Using winsys: x11
Deserialize yoloLayer plugin: yolo_99
Deserialize yoloLayer plugin: yolo_108
Deserialize yoloLayer plugin: yolo_117
0:00:03.522306324 30756 0x7f3c002380 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 2]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1701> [UID = 2]: deserialized trt engine from :/home/admin123/deepstream/DeepStream-Yolo/native/model_b16_gpu0_fp16_helmet.engine
INFO: [Implicit Engine Info]: layers num: 4
0 INPUT kFLOAT data 3x416x416
1 OUTPUT kFLOAT yolo_99 24x52x52
2 OUTPUT kFLOAT yolo_108 24x26x26
3 OUTPUT kFLOAT yolo_117 24x13x13

0:00:03.522553823 30756 0x7f3c002380 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 2]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1805> [UID = 2]: Use deserialized engine model: /home/admin123/deepstream/DeepStream-Yolo/native/model_b16_gpu0_fp16_helmet.engine
0:00:03.533651338 30756 0x7f3c002380 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<secondary_gie_0> [UID 2]: Load new model:/home/admin123/deepstream/DeepStream-Yolo/examples/multiple_inferences/sgie1/config_infer_secondary1.txt sucessfully
Deserialize yoloLayer plugin: yolo_51
Deserialize yoloLayer plugin: yolo_59
0:00:03.886455896 30756 0x7f3c002380 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1701> [UID = 1]: deserialized trt engine from :/home/admin123/deepstream/DeepStream-Yolo/native/model_b1_gpu0_fp16_personv3.engine
INFO: [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT data 3x416x416
1 OUTPUT kFLOAT yolo_51 18x13x13
2 OUTPUT kFLOAT yolo_59 18x26x26

0:00:03.886608479 30756 0x7f3c002380 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1805> [UID = 1]: Use deserialized engine model: /home/admin123/deepstream/DeepStream-Yolo/native/model_b1_gpu0_fp16_personv3.engine
0:00:03.888024542 30756 0x7f3c002380 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/home/admin123/deepstream/DeepStream-Yolo/examples/multiple_inferences/pgie/config_infer_primary.txt sucessfully

Runtime commands:
h: Print this help
q: Quit

p: Pause
r: Resume

NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
To go back to the tiled display, right-click anywhere on the window.

** INFO: <bus_callback:181>: Pipeline ready

Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
** INFO: <bus_callback:167>: Pipeline running

WARNING: Num classes mismatch. Configured: 1, detected by network: 0
WARNING: Num classes mismatch. Configured: 1, detected by network: 0
WARNING: Num classes mismatch. Configured: 1, detected by network: 0
WARNING: Num classes mismatch. Configured: 1, detected by network: 0
WARNING: Num classes mismatch. Configured: 1, detected by network: 0
WARNING: Num classes mismatch. Configured: 1, detected by network: 0
WARNING: Num classes mismatch. Configured: 1, detected by network: 0
WARNING: Num classes mismatch. Configured: 1, detected by network: 0


XiangjiBU commented Dec 25, 2020

config_infer_primary.txt

[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
model-color-format=0

custom-network-config=yolov3_person.cfg
model-file=yolov3_person_best.weights
model-engine-file=model_b1_gpu0_fp16_personv3.engine
labelfile-path=labels.txt

batch-size=1
network-mode=2
num-detected-classes=1
interval=0
gie-unique-id=1
process-mode=1
network-type=0
cluster-mode=4
maintain-aspect-ratio=0
parse-bbox-func-name=NvDsInferParseYolo
custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so
engine-create-func-name=NvDsInferYoloCudaEngineGet

[class-attrs-all]
pre-cluster-threshold=0.25

@XiangjiBU

config_infer_secondary1.txt

[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
model-color-format=0

custom-network-config=custom_yolov4_helmet.cfg
model-file=custom_yolov4_helmet_best.weights
model-engine-file=model_b16_gpu0_fp16_helmet.engine
labelfile-path=labels_helmet.txt

batch-size=16
network-mode=2
num-detected-classes=3
interval=0
gie-unique-id=2
process-mode=2
#operate-on-gie-id=1
#operate-on-class-ids=0
network-type=0
cluster-mode=4
maintain-aspect-ratio=0
parse-bbox-func-name=NvDsInferParseYolo
custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so
engine-create-func-name=NvDsInferYoloCudaEngineGet

[class-attrs-all]
pre-cluster-threshold=0.25

@XiangjiBU

deepstream_app_config.txt

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5

[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=720
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
type=3
uri=file:///opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_1080p_h264.mp4

num-sources=1
gpu-id=0
cudadec-memtype=0

[sink0]
enable=1
type=2
sync=0
source-id=0
gpu-id=0
nvbuf-memory-type=0

[osd]
enable=1
gpu-id=0
border-width=1
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
live-source=0
batch-size=1
batched-push-timeout=40000
width=1920
height=1080
enable-padding=0
nvbuf-memory-type=0

[primary-gie]
enable=1
gpu-id=0
gie-unique-id=1
nvbuf-memory-type=0
config-file=pgie/config_infer_primary.txt

[secondary-gie0]
enable=1
gpu-id=0
gie-unique-id=2
operate-on-gie-id=1
operate-on-class-ids=0
nvbuf-memory-type=0
config-file=sgie1/config_infer_secondary1.txt

[tests]
file-loop=0

@marcoslucianops

I will test this today


marcoslucianops commented Dec 27, 2020

Hi @XiangjiBU, sorry for the delay.

I found the problem and updated the repo.

See multipleInferences.md

Thanks.

@XiangjiBU

> Hi @XiangjiBU, sorry for the delay.
>
> I found the problem and updated the repo.
>
> See multipleInferences.md
>
> Thanks.

I tried the new repo and configured it exactly following "multipleInferences.md", but it still doesn't work.
Can you show me where the problem is?


marcoslucianops commented Dec 28, 2020

> I tried the new repo and configured it exactly following "multipleInferences.md", but it still doesn't work.
>
> Can you show me where the problem is?

Put all files (cfg/weights/labels) in the deepstream/sources/yolo directory (without pgie/sgie folders) and use only one nvdsinfer_custom_impl_Yolo folder for all inference engines.

If it doesn't work, try rebuilding the models with this new folder.


XiangjiBU commented Dec 29, 2020

> > I tried the new repo and configured it exactly following "multipleInferences.md", but it still doesn't work. Can you show me where the problem is?
>
> Put all files (cfg/weights/labels) in the deepstream/sources/yolo directory (without pgie/sgie folders) and use only one nvdsinfer_custom_impl_Yolo folder for all inference engines.
>
> If it doesn't work, try rebuilding the models with this new folder.

Thanks, I tried again, but I still see two issues:

  1. I put the files in the yolo folder, but the secondary gie detects a huge number of bboxes. I tried setting "cluster-mode=2", but I still get too many bboxes. Can you help me figure out what happened?
  2. When the primary-gie and secondary-gie are different YOLO versions (for example, pgie is a custom YOLOv4 and sgie1 is a custom YOLOv3), the repo doesn't seem to work. Did I do something wrong, or is that indeed not supported?


marcoslucianops commented Dec 29, 2020

> 1. I put the files in the yolo folder, but the secondary gie detects a huge number of bboxes. I tried setting "cluster-mode=2", but I still get too many bboxes. Can you help me figure out what happened?

cluster-mode sets which NMS mode DeepStream uses. In my code, an NMS function is already added to nvdsparsebbox_Yolo.cpp for the YOLOv3 and YOLOv4 models. With cluster-mode=2 you would run a second NMS pass after the built-in one, so it is better to use cluster-mode=4 (no clustering).

To decrease the number of bboxes, increase pre-cluster-threshold, where 0 is 0% and 1.0 is 100% of the confidence required to keep a bbox.
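The interplay between the confidence threshold and NMS can be sketched in plain Python. This is an illustrative standalone version only: the repository's actual parser is C++ code in nvdsparsebbox_Yolo.cpp, and the box tuple format and threshold defaults here are assumptions for the sketch.

```python
# Minimal greedy-NMS sketch (illustrative; not the repo's C++ implementation).
# Boxes are hypothetical (x1, y1, x2, y2, score) tuples.

def iou(a, b):
    # Intersection-over-union of two axis-aligned boxes.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def nms(boxes, score_thresh=0.25, iou_thresh=0.45):
    # 1) Drop low-confidence boxes (analogous to pre-cluster-threshold).
    boxes = [b for b in boxes if b[4] >= score_thresh]
    # 2) Greedily keep the highest-scoring box, suppress heavy overlaps.
    boxes.sort(key=lambda b: b[4], reverse=True)
    kept = []
    for b in boxes:
        if all(iou(b, k) < iou_thresh for k in kept):
            kept.append(b)
    return kept

detections = [
    (10, 10, 50, 50, 0.9),      # strong detection
    (12, 12, 52, 52, 0.6),      # heavy overlap with the first -> suppressed
    (100, 100, 140, 140, 0.5),  # separate object -> kept
    (200, 200, 240, 240, 0.1),  # below score threshold -> dropped
]
print(nms(detections))  # -> [(10, 10, 50, 50, 0.9), (100, 100, 140, 140, 0.5)]
```

Note that greedy NMS is idempotent: running it a second time on its own output (as cluster-mode=2 would effectively do after the built-in pass) changes nothing, so the extra pass only adds cost.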

> 2. When the primary-gie and secondary-gie are different YOLO versions (for example, pgie is a custom YOLOv4 and sgie1 is a custom YOLOv3), the repo doesn't seem to work.

I believe it will work, because the code is the same for all models; only the kernel differs, calling different functions for each model.

I have tested only with YOLOv4, but I will test the other models in the future.


marcoslucianops commented Dec 31, 2020

Hi @XiangjiBU

Please see my multipleInferences.md again. I reverted the files and updated them. Now you can use different versions/models with separate gie folders without errors (see especially the Editing yoloPlugin.h section).


XiangjiBU commented Jan 1, 2021

> Please see my multipleInferences.md again. I reverted the files and updated them. Now you can use different versions/models with separate gie folders without errors (see especially the Editing yoloPlugin.h section).

It works, thanks!!!
