Description of Problem & Solution

Sometimes there are shorter fades-through-black in the middle of a scene and you don't want to introduce cut points at them. Currently, the process_frame method of ThresholdDetector will introduce a scene cut any time the frame average dips below the threshold.
Proposed Implementation:
Add an optional argument --min-cut-len=[time] to detect-threshold that will skip (that is, not cut at) fades-through-black that are shorter than time. If time is an integer, interpret it as a number of frames; if time is a floating-point number with s appended, interpret it as a number of seconds; if time is a timecode, interpret it as an hour:minute:second duration.

I think all that would have to be done is to set a property min_cut_len and change line 149 of scenedetect.detectors.threshold_detector from
elif self.last_fade['type'] == 'out' and frame_avg >= self.threshold:
to something along the lines of
elif self.last_fade['type'] == 'out' and (frame_num - self.last_fade['frame'] > self.min_cut_len) and frame_avg >= self.threshold:
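For concreteness, here is a minimal standalone sketch of that idea. The class name, the _frame_average helper, and the assumption that min_cut_len is already expressed in frames are illustrative rather than taken from the actual ThresholdDetector; converting the [time] argument to frames could presumably reuse scenedetect's FrameTimecode, which is not shown here.

# Hypothetical sketch: gate the cut on the length of the fade-out.
# `min_cut_len` is assumed to already be converted to a frame count;
# the last_fade bookkeeping mirrors what the real detector tracks.
class MinCutLenThresholdDetector:
    def __init__(self, threshold=12, min_cut_len=0):
        self.threshold = threshold
        self.min_cut_len = min_cut_len  # minimum fade length, in frames
        self.last_fade = {'frame': 0, 'type': None}

    def _frame_average(self, frame):
        # Average pixel intensity of the frame (placeholder implementation,
        # assuming `frame` is a NumPy array).
        return frame.mean()

    def process_frame(self, frame_num, frame):
        cuts = []
        frame_avg = self._frame_average(frame)
        if self.last_fade['type'] != 'out' and frame_avg < self.threshold:
            # Frame average dipped below the threshold: start of a fade-out.
            self.last_fade = {'frame': frame_num, 'type': 'out'}
        elif (self.last_fade['type'] == 'out'
              and frame_num - self.last_fade['frame'] > self.min_cut_len
              and frame_avg >= self.threshold):
            # Fade back in after a sufficiently long fade-out: emit a cut.
            cuts.append(frame_num)
            self.last_fade = {'frame': frame_num, 'type': 'in'}
        elif self.last_fade['type'] == 'out' and frame_avg >= self.threshold:
            # Fade-out was shorter than min_cut_len: ignore it, no cut.
            self.last_fade = {'frame': frame_num, 'type': 'in'}
        return cuts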
Alternative Solutions:
Setting the minimum scene length with --min-scene-len or -m will sometimes have the desired effect, but only coincidentally. If a 20-minute scene has a half-second fade five minutes from the beginning, setting -m=360s will not cut at that point. But setting the minimum scene length that high may also suppress cuts at longer, genuine fades and effectively clump parts of different scenes together.
Thanks for the detailed description. For some background: by far the biggest issue with the existing SceneDetector interface is that all detectors can currently only output cuts. Going forward to v1.0, I want to change the API so that detectors can output different types of events, e.g. CUT, IN/START, and OUT/END events. You could then add filters to the event list, e.g. filter out consecutive OUT/IN events closer than min_cut_len as a post-processing step. That's still quite a ways off, however.
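As an illustration of that post-processing idea, here is a rough sketch. The event tuples, the EventType enum, and the filter function are all hypothetical, since the v1.0 event-based API does not exist yet.

from enum import Enum, auto

class EventType(Enum):
    # Hypothetical event kinds for a future event-based detector API.
    CUT = auto()
    OUT = auto()  # fade-out / scene end
    IN = auto()   # fade-in / scene start

def filter_short_fades(events, min_cut_len):
    """Drop OUT/IN pairs that are closer together than min_cut_len frames.

    `events` is a list of (frame_num, EventType) tuples sorted by frame number.
    """
    filtered = []
    i = 0
    while i < len(events):
        frame_num, kind = events[i]
        if (kind is EventType.OUT
                and i + 1 < len(events)
                and events[i + 1][1] is EventType.IN
                and events[i + 1][0] - frame_num < min_cut_len):
            # Short fade-through-black: skip both the OUT and the IN event.
            i += 2
            continue
        filtered.append(events[i])
        i += 1
    return filtered

# Example: a 12-frame dip (half a second at 24 fps) is removed, the longer one kept.
events = [(100, EventType.OUT), (112, EventType.IN),
          (500, EventType.OUT), (560, EventType.IN)]
print(filter_short_fades(events, min_cut_len=24))
# Only the longer fade's OUT/IN pair (frames 500 and 560) remains.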
In the meantime, your solution seems like a very reasonable approach to get around this issue. I would suggest a different name for the argument though, e.g. --min-out-length or something, just because the term cuts might be a bit confusing when dealing with fade in/fade out events.
Edit: I don't have a specific timeline for adding this, but I'm happy to accept any PRs against the upcoming version (v0.6.1) if someone wants to add this in early. Otherwise I will schedule it for one of the following releases.