
Add option to detect-threshold to ignore shorter scene boundaries #278

Open
leingang opened this issue Jul 17, 2022 · 2 comments

@leingang
Contributor

Description of Problem & Solution

Sometimes there are short fades-through-black in the middle of a scene where you don't want to introduce cut points. Currently, the process_frame method of ThresholdDetector introduces a scene cut any time the frame average dips below the threshold.

Proposed Implementation:

Add an optional argument --min-cut-len=[time] to detect-threshold that will skip (that is, not cut at) fades-through-black that are shorter than time. If time is an integer, interpret it as a number of frames; if time is a floating-point number with s appended, interpret it as a number of seconds; if time is a time code, interpret it as an hour:minute:second duration.
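The three accepted forms could be parsed along these lines (parse_min_cut_len is a hypothetical helper name, not part of PySceneDetect; it converts every form to a frame count for the comparison below):

```python
def parse_min_cut_len(value: str, framerate: float) -> int:
    """Convert a --min-cut-len value to a number of frames.

    Accepts an integer frame count ("30"), a seconds value with an
    's' suffix ("1.5s"), or an H:M:S time code ("0:00:02").
    """
    if ':' in value:
        # Time code: hours:minutes:seconds duration.
        hours, minutes, seconds = value.split(':')
        total_seconds = int(hours) * 3600 + int(minutes) * 60 + float(seconds)
        return round(total_seconds * framerate)
    if value.endswith('s'):
        # Seconds, converted to frames via the video framerate.
        return round(float(value[:-1]) * framerate)
    # Plain integer: already a frame count.
    return int(value)
```

At 24 fps, for example, "30" gives 30 frames, "1.5s" gives 36, and "0:00:02" gives 48.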

I think all that would have to be done is to set a property min_cut_len and change line 149 of scenedetect.detectors.threshold_detector from

        elif self.last_fade['type'] == 'out' and frame_avg >= self.threshold:

to something along the lines of

        elif self.last_fade['type'] == 'out' and (frame_num - self.last_fade['frame'] > self.min_cut_len) and frame_avg >= self.threshold:
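To make the idea concrete, here is a minimal, self-contained sketch of the fade-out/fade-in bookkeeping with the proposed check added. The class and attribute names are illustrative only; this is not PySceneDetect's actual ThresholdDetector, just a toy model of the relevant logic:

```python
class MiniThresholdDetector:
    """Toy model of ThresholdDetector's fade logic with min_cut_len added."""

    def __init__(self, threshold: float = 12.0, min_cut_len: int = 0):
        self.threshold = threshold
        self.min_cut_len = min_cut_len  # minimum fade-out length, in frames
        self.last_fade = {'frame': 0, 'type': None}
        self.cuts = []

    def process_frame(self, frame_num: int, frame_avg: float) -> None:
        if self.last_fade['type'] != 'out' and frame_avg < self.threshold:
            # Frame average dipped below threshold: a fade-out begins.
            self.last_fade = {'frame': frame_num, 'type': 'out'}
        elif self.last_fade['type'] == 'out' and frame_avg >= self.threshold:
            # Fade-in: only cut if the fade-out lasted longer than min_cut_len.
            if frame_num - self.last_fade['frame'] > self.min_cut_len:
                self.cuts.append(frame_num)
            self.last_fade = {'frame': frame_num, 'type': 'in'}
```

With min_cut_len=5, a 3-frame dip below the threshold is ignored while a 10-frame dip still produces a cut; with min_cut_len=0 the behavior matches the current unconditional cut.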

Alternative Solutions:

Setting the minimum scene length with --min-scene-len or -m will sometimes have the desired effect, but only coincidentally. If a 20-minute scene has a half-second fade five minutes from the beginning, setting -m=360s will not cut at that point. But setting this minimum scene length too high may miss the longer fades and effectively clump different parts of different scenes together.

@Breakthrough
Owner

Breakthrough commented Jul 17, 2022

Thanks for the detailed description. For some background, by far the biggest issue with the existing SceneDetector interface is that all detectors can currently only output cuts. Going forward to v1.0, I want to change the API so that detectors can output different types of events, e.g. CUT, IN/START, and OUT/END events. You could then add filters to the event list, e.g. filter consecutive OUT/IN events closer than min_cut_len as a post-processing step. That's still quite a ways off, however.
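As a rough illustration of that post-processing idea (the event model here is hypothetical, since the v1.0 API doesn't exist yet): represent events as (frame_num, kind) tuples and drop any OUT immediately followed by an IN that arrives within min_cut_len frames:

```python
def filter_short_fades(events, min_cut_len):
    """Drop OUT/IN event pairs separated by fewer than min_cut_len frames.

    events: list of (frame_num, kind) tuples, kind in {'CUT', 'OUT', 'IN'},
    sorted by frame number. Returns a new, filtered list.
    """
    filtered = []
    i = 0
    while i < len(events):
        frame, kind = events[i]
        if (kind == 'OUT' and i + 1 < len(events)
                and events[i + 1][1] == 'IN'
                and events[i + 1][0] - frame < min_cut_len):
            i += 2  # skip the short fade-out/fade-in pair entirely
            continue
        filtered.append(events[i])
        i += 1
    return filtered
```

For example, with a 3-frame fade at frame 100 and a 20-frame fade at frame 500, min_cut_len=10 keeps only the longer fade.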

In the meantime, your solution seems like a very reasonable approach to get around this issue. I would suggest a different name for the argument though, e.g. --min-out-length or something, just because the term cuts might be a bit confusing when dealing with fade in/fade out events.

Edit: I don't have a specific timeline for adding this, but I'm happy to accept any PRs to the upcoming version (v0.6.1) if someone wants to add this in early. Otherwise I will schedule it for one of the following releases.

@leingang
Contributor Author

Glad you like the suggestion. I wasn't totally happy with the option name either. min-out-length is better.

As you can see, I got pretty close to taking a stab at it. I'll give it a shot if I can find the time.
