
lavfi/dnn: Unification of Async and Sync Modes #423

Closed
wants to merge 6 commits

Conversation

shubhanshu02
Contributor

@shubhanshu02 commented Jul 10, 2021

Patch Set Description

This patchset is part of the optional deliverables of the GSoC project Async Support for TensorFlow Backend in FFmpeg.

Objective: Unification of async and sync modes in the DNN backends from the filters' perspective.
Parts under this deliverable:
- Unifies the async and sync modes in the DNN backends from the filters' perspective.
- Adds TaskItem-based inference in the Native backend.
- Renames InferenceItem to LastLevelTaskItem.

Methodology

In this patchset, we deliver a single inference function to the filters that covers both async and sync modes (replacing the pair ff_dnn_execute_model_async and ff_dnn_execute_model). This simplifies the code on the filter side, since the filters no longer have to care about the mode of execution. With this patchset, the filters first send the input frames and then collect the results using the ff_dnn_get_result function, which returns the input and output frames in order of submission.
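
A rough sketch of the resulting filter-side pattern (a hedged illustration, not the exact patch code; MyDNNFilterContext and its dnnctx field are hypothetical stand-ins for a filter's private context, and cleanup on error paths is trimmed):

    /* Sketch: one code path drives both sync and async execution.
     * The filter submits a frame, then drains whatever results are
     * ready; in sync mode the result is simply ready immediately. */
    static int filter_frame(AVFilterLink *inlink, AVFrame *in)
    {
        AVFilterContext *filter_ctx = inlink->dst;
        MyDNNFilterContext *ctx = filter_ctx->priv;  /* hypothetical */
        AVFrame *out = av_frame_clone(in);  /* simplified; a real filter
                                               allocates a writable buffer */
        AVFrame *in_done = NULL, *out_done = NULL;
        int ret;

        if (!out)
            return AVERROR(ENOMEM);
        if (ff_dnn_execute_model(&ctx->dnnctx, in, out) != DNN_SUCCESS)
            return AVERROR(EIO);

        /* Collect completed inferences, in order of submission. */
        while (ff_dnn_get_result(&ctx->dnnctx, &in_done, &out_done) == DAST_SUCCESS) {
            av_frame_free(&in_done);
            ret = ff_filter_frame(filter_ctx->outputs[0], out_done);
            if (ret < 0)
                return ret;
        }
        return 0;
    }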

The execution mode can now be set by using the async flag in the backend_configs as follows:

  • async mode: backend_configs=async=1
  • sync mode: backend_configs=async=0
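
For example (with illustrative input and model file names), running dnn_processing on the TensorFlow backend in async mode could look like:

    ffmpeg -i input.jpg -vf dnn_processing=dnn_backend=tensorflow:model=model.pb:input=x:output=y:backend_configs='async=1' output.jpg -y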

Another part of this patchset renames InferenceItem to LastLevelTaskItem in the three backends, along with the related changes, to avoid further confusion about the meaning of these structs.
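
For context, the relationship between the two structs is roughly the following (a simplified sketch of the backend-side types; field lists abridged):

    /* A TaskItem represents one inference request from a filter. */
    typedef struct TaskItem {
        void *model;              /* backend model instance */
        AVFrame *in_frame;
        AVFrame *out_frame;
        uint32_t inference_todo;  /* lltasks still to run for this task */
        uint32_t inference_done;  /* lltasks already completed */
        /* ... input/output names, async and ioproc flags ... */
    } TaskItem;

    /* A LastLevelTaskItem (formerly InferenceItem) is the smallest unit
     * the backend actually executes; one task may expand into several of
     * them, e.g. one per detected bounding box in dnn_classify. */
    typedef struct LastLevelTaskItem {
        TaskItem *task;      /* the task this lltask belongs to */
        uint32_t bbox_index; /* which bounding box this lltask processes */
    } LastLevelTaskItem;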

Patches

2c2a825 Task-based Inference in Native Backend
5db5789 Unify Execution Modes in DNN Filters
c0b4154 Remove synchronous functions from DNN filters
42ce1db Remove Async Flag from DNN Filter Side
5f0a0ed Rename InferenceItem to LastLevelTaskItem
85fed1e Include dnn_processing in docs of sr and derain filter

@shubhanshu02 force-pushed the unification branch 3 times, most recently from eee1825 to 160b77a on August 13, 2021 at 20:33
@shubhanshu02 force-pushed the unification branch 3 times, most recently from 81704ce to 092a580 on August 15, 2021 at 08:01
@guoyejun
Collaborator

The four patches related to the last-level task can be merged together. I see the result would be large (thanks for considering small patches), but it is more natural for them to be together.

@guoyejun
Collaborator

guoyejun commented Aug 20, 2021

Looks good to me, please send to the FFmpeg community. Thanks.

@Semmer2 could you also run a test after the patches are sent? Thanks.

@shubhanshu02
Contributor Author

Sure, I'll send it in a few minutes after rebasing to the latest master. Thank you for reviewing.

@shubhanshu02 force-pushed the unification branch 4 times, most recently from 145434e to 8a214fc on August 23, 2021 at 20:18
@guoyejun
Collaborator

lgtm

@Semmer2
Collaborator

Semmer2 commented Aug 24, 2021

Hi,
The new patches may crash when using the TF backend with dnn_detect. It may be caused by a null frame in dnn_detect_post_proc. Please check this.
The command I used is: ./ffmpeg -i dog.jpg -vf dnn_detect=dnn_backend=tensorflow:input=image_tensor:output="num_detections&detection_scores&detection_classes&detection_boxes":model=ssd_mobilenet_v2_coco.pb:backend_configs='async=0',showinfo -y face_detect.jpeg
Please re-check all the DNN filters and backends. :)

@shubhanshu02
Contributor Author

I have corrected the patch and tested it with the following commands.

Classification

I tried these values of async: (0,0), (0,1), (1,0), and (1,1), where (x,y) = (async value in dnn_detect, async value in dnn_classify).

ffmpeg -i cici.jpg -vf dnn_detect=dnn_backend=openvino:model=face-detection-adas-0001.xml:backend_configs='async=0':input=data:output=detection_out:confidence=0.6:labels=face-detection-adas-0001.label,dnn_classify=dnn_backend=openvino:model=emotions-recognition-retail-0003.xml:input=data:output=prob_emotion:confidence=0.3:labels=emotions-recognition-retail-0003.label:backend_configs='async=0',showinfo -f null -

Detection

Tried with async=0 and async=1

ffmpeg -i cici.jpg -vf dnn_detect=dnn_backend=openvino:model=face-detection-adas-0001.xml:backend_configs='async=0':input=data:output=detection_out:confidence=0.6:labels=face-detection-adas-0001.label,showinfo -f null -
ffmpeg -i dog.jpg -vf  dnn_detect=dnn_backend=tensorflow:input=image_tensor:output="num_detections&detection_scores&detection_classes&detection_boxes":model=ssd_mobilenet_v2_coco.pb:backend_configs='async=1',showinfo -y face_detect.jpeg

Derain

Tried with async=0 and async=1

ffmpeg -i rain.jpg -vf derain=model=can.pb:dnn_backend=1 derain.jpg -y 
ffmpeg -i rain.jpg -vf derain=model=can.model:dnn_backend=0 derain.jpg -y 

SR filter

Tried with async=0 and async=1

ffmpeg -i 480p.jpg -vf sr=model=espcn.pb:dnn_backend=1 espcn.jpg -y 
ffmpeg -i 480p.jpg -vf sr=model=espcn.model:dnn_backend=0 espcn.jpg -y 

DNN Processing

Tried with async=0 and async=1

ffmpeg -i rain.jpg -vf format=rgb24,dnn_processing=dnn_backend=openvino:model=can.xml:input=x:output=can/sub:backend_configs="input_resizable=1&async=0" -y rain1.jpg
ffmpeg -i rain.jpg -vf format=rgb24,dnn_processing=dnn_backend=tensorflow:model=can.pb:input=x:output=y:backend_configs='async=1' derain.jpg -y
ffmpeg -i rain.jpg -vf format=rgb24,dnn_processing=dnn_backend=native:model=can.model:input=x:output=y:backend_configs='async=0' derain.jpg -y

In the native backend, async is not supported, so async=1 falls back to sync mode.
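
A plausible shape of that fallback (a hedged sketch; the option struct and log message here are illustrative, not the exact backend code):

    /* Sketch: a backend without async support can warn and clear the
     * flag, so a request with async=1 still runs, just synchronously. */
    if (ctx->options.async) {
        av_log(ctx, AV_LOG_WARNING,
               "Async not supported by this backend, rolling back to sync\n");
        ctx->options.async = 0;
    }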

@guoyejun @Semmer2 Sir, can you have a look at the above commands and check if I missed any filter? Thank you.

@Semmer2
Collaborator

Semmer2 commented Aug 25, 2021

Hi Shubhanshu,
LGTM. The code passed my local test. :)

@shubhanshu02
Contributor Author

Thanks. I'll send the updated patches to the mailing list.

@guoyejun
Collaborator

We may need to merge the last patch into the patch that caused the issue.

@shubhanshu02
Contributor Author

> We may need to merge the last patch into the patch that caused the issue.

I think the patch libavfilter: Unify Execution Modes in DNN Filters is the one that caused this issue. When I built on it, it showed the same bug. The one just before it is related to the Native backend, so it shouldn't be contributing to this problem.

Shall I merge these? (these = libavfilter: Unify Execution Modes in DNN Filters + libavfilter: Send only input frame for DNN Detect and Classify)

@guoyejun
Collaborator

> We may need to merge the last patch into the patch that caused the issue.
>
> I think the patch libavfilter: Unify Execution Modes in DNN Filters is the one that caused this issue. When I built on it, it showed the same bug. The one just before it is related to the Native backend, so it shouldn't be contributing to this problem.
>
> Shall I merge these? (these = libavfilter: Unify Execution Modes in DNN Filters + libavfilter: Send only input frame for DNN Detect and Classify)

Yes, we'd like every single patch to avoid breaking things.

@shubhanshu02 force-pushed the unification branch 3 times, most recently from 19d2108 to c604077 on August 25, 2021 at 21:15
This commit rearranges the code in the Native backend to use the
TaskItem for inference.

Signed-off-by: Shubhanshu Saxena <shubhanshu.e01@gmail.com>
This commit unifies the async and sync modes from the DNN filters'
perspective. As of this commit, the Native backend only supports
synchronous execution mode.

Now the user can switch between async and sync mode by using the
'async' option in the backend_configs. The values can be 1 for
async and 0 for sync mode of execution.

This commit affects the following filters:
1. vf_dnn_classify
2. vf_dnn_detect
3. vf_dnn_processing
4. vf_sr
5. vf_derain

This commit also updates the filters vf_dnn_detect and vf_dnn_classify
to send only the input frame and send NULL as output frame instead of
input frame to the DNN backends.

Signed-off-by: Shubhanshu Saxena <shubhanshu.e01@gmail.com>
This commit removes the unused sync mode specific code from the DNN
filters since the sync and async modes are now unified from the
filters' perspective.

Signed-off-by: Shubhanshu Saxena <shubhanshu.e01@gmail.com>
Remove async flag from filter's perspective after the unification
of async and sync modes in the DNN backend.

Signed-off-by: Shubhanshu Saxena <shubhanshu.e01@gmail.com>
This patch renames the InferenceItem to LastLevelTaskItem in the
three backends to avoid confusion about the meanings of these structs.

The following are the renames done in this patch:

1. extract_inference_from_task -> extract_lltask_from_task
2. InferenceItem -> LastLevelTaskItem
3. inference_queue -> lltask_queue
4. inference -> lltask
5. inference_count -> lltask_count

Signed-off-by: Shubhanshu Saxena <shubhanshu.e01@gmail.com>
Signed-off-by: Shubhanshu Saxena <shubhanshu.e01@gmail.com>
@guoyejun
Collaborator

I see you just force-pushed the patches; is it a rebase or something new?

@shubhanshu02
Contributor Author

shubhanshu02 commented Aug 27, 2021

> I see you just force-pushed the patches; is it a rebase or something new?

Actually, I saw some CI tests failing, so I thought I'd rebase against the latest master. Nothing new was added.

@shubhanshu02
Contributor Author

Thank you for reviewing, @guoyejun and @Semmer2.
