
Slow Inference Pipeline with M3U8 Format #1160

Open
@emma-smashvision

Description


Search before asking

  • I have searched the Inference issues and found no similar feature requests.

Question

Hi,

Introduction
While working with the inference pipeline, I ran into a significant performance problem when processing M3U8 (HTTP Live Streaming) files. Unlike MP4 files, which process smoothly, M3U8 inputs cause the pipeline to skip frames. Could this be investigated? M3U8 support matters for all live-streaming services, and I would really like to use it in my workflow as well.

I am not testing against a live source; instead I have a video that plays as a 25 fps livestream in M3U8 format. When I manually read the frames with OpenCV and process them, everything works and looks fine. When I use the inference pipeline, however, it starts skipping a lot of frames. I also tried different max_fps values (from 1 to 100), but the problem persists.

Question
How can I make sure that none of the frames from my m3u8 stream are skipped using the inference pipeline?

My code is the following:

self.pipeline = InferencePipeline.init_with_workflow(
    api_key=self.api_key,
    workspace_name=self.workspace_name,
    workflow_id=self.workflow_id,
    video_reference=video_url,
    max_fps=100,
    on_prediction=self.process_frame,
)

self.pipeline.start()
self.pipeline.join()

Any help would be appreciated!

With kind regards,
Emma

Additional

No response

Labels: question (Further information is requested)