Help with trimming, scaling and concatenating from one input file (memory leak) #817
Okay, so I tried it with another video file, this time without scaling, and the same thing happens. Here are the logs:
After this point it ate all 16 GB of RAM, then the swap file quickly grew past 50 GB, the speed dropped to 1.2x, and the system became slow to respond. I tried to shut down the kernel and succeeded.
At this point, as an alternative, I'm thinking maybe I should export the trimmed parts into temporary files and use the concat demuxer to put them together again. Or would something else be faster?
Okay, so I went with the following: first trim (with seeking) and encode the parts, then use the concat demuxer in a separate step to put it all together. Still, if it could be done in one step without the memory leak, that would be very nice, so I'll keep this issue open. Someone might have a suggestion.
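The two-step workflow described above can be sketched by driving the ffmpeg CLI from Python. This is a rough sketch under assumptions: the helper names, the `part_N.mp4` temp filenames, and the `parts.txt` list file are all placeholders, and encoding settings are left at ffmpeg's defaults.

```python
# Sketch of the two-step workflow: re-encode each trimmed part to a temp
# file using fast input-side seeking, then join the parts losslessly with
# ffmpeg's concat demuxer (-f concat) and stream copy (-c copy).
import subprocess

def trim_cmd(in_filename, start, end, out_part):
    """Build an ffmpeg command that encodes one (start, end) slice, in seconds."""
    return [
        "ffmpeg", "-y",
        "-ss", str(start),          # seeking before -i: fast input-side seek
        "-t", str(end - start),     # duration of the slice
        "-i", in_filename,
        out_part,
    ]

def concat_cmd(list_path, out_filename):
    """Build an ffmpeg concat-demuxer command that stream-copies the parts."""
    return [
        "ffmpeg", "-y",
        "-f", "concat", "-safe", "0",
        "-i", list_path,
        "-c", "copy",               # no re-encode: parts share codec parameters
        out_filename,
    ]

def trim_and_concat(in_filename, parts, out_filename):
    """parts is a list of (start, end) pairs in seconds (hypothetical helper)."""
    part_files = []
    for i, (start, end) in enumerate(parts):
        part = f"part_{i}.mp4"
        subprocess.run(trim_cmd(in_filename, start, end, part), check=True)
        part_files.append(part)
    # The concat demuxer reads its inputs from a text file, one per line.
    with open("parts.txt", "w") as f:
        f.writelines(f"file '{p}'\n" for p in part_files)
    subprocess.run(concat_cmd("parts.txt", out_filename), check=True)
```

Because the parts are encoded independently and the final join is a stream copy, peak memory stays bounded by a single encode, which matches why the two-step approach sidesteps the leak.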
I'm also experiencing this memory leak. My script started failing with the latest ffmpeg release, 6.1, but works with older ffmpeg versions. I run a very similar workflow:

```python
def generate_video_preview(in_filename, out_filename, sample_duration, sample_seconds, scale, format, quiet):
    in_file = ffmpeg.input(in_filename)
    samples = sample_video(in_file, sample_duration=sample_duration, sample_seconds=sample_seconds)
    stream = ffmpeg.concat(*samples)
    if scale is not None:
        width, height = scale.split(':')
        stream = ffmpeg.filter(stream, 'scale', width=width, height=height, force_original_aspect_ratio='decrease')
    (
        ffmpeg
        .output(stream, out_filename, format=format)
        .overwrite_output()
        .run(quiet=quiet)
    )
```
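The `sample_video` helper referenced above isn't shown in the thread. A minimal sketch of the interval math such a helper might use is below; the function name, parameters, and the evenly-spaced sampling policy are all assumptions, not the commenter's actual implementation. It yields `(start, end)` pairs in seconds that could feed `trim`/`atrim`.

```python
# Hypothetical interval math for a sample_video-style helper: take one
# window of sample_seconds length every sample_duration seconds.
def sample_intervals(total_duration, sample_duration, sample_seconds):
    """Return (start, end) pairs in seconds covering evenly spaced samples."""
    intervals = []
    start = 0.0
    while start + sample_seconds <= total_duration:
        intervals.append((start, start + sample_seconds))
        start += sample_duration
    return intervals
```

Expressing the cut points in seconds (rather than frame indices) matters here, given the frame-based `trim` arguments implicated later in the thread.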
See, I was sure that when I tried this earlier it worked without a memory leak, but by "earlier" I mean a couple of weeks ago. Then I changed computers, and the exact same code produced a memory leak, so it must have been the version change. Though I would say it was 6.1.1 rather than the 6.1 release. Unfortunately I cannot test on my old computer anymore, since it has been wiped.
I have found the reason for the memory leak on my end: make sure start and end are actual timestamps or floating-point seconds. I was using start_frame/end_frame for both trim and the atrim filter, which caused these leaks.
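A small sketch of applying that fix: convert frame-based cut points to the seconds-based `start`/`end` form that `trim` and `atrim` accept. The helper names are hypothetical, and the frame rate must come from the actual source (e.g. probed with ffprobe), not hard-coded.

```python
# Convert frame indices to seconds so trim/atrim can be called with
# start/end (the form that avoided the leak in this thread) instead of
# start_frame/end_frame.
def frames_to_seconds(frame, fps):
    """fps must match the source's real frame rate (e.g. via ffprobe)."""
    return frame / fps

def trim_args(start_frame, end_frame, fps):
    """Return seconds-based kwargs for a trim/atrim call (hypothetical helper)."""
    return {
        "start": frames_to_seconds(start_frame, fps),
        "end": frames_to_seconds(end_frame, fps),
    }
```

These kwargs could then be passed to both the video and audio trim filters so the two stay in sync.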
Hello all,
I scoured forums (and the issues here) for some time and couldn't find the cause of my problem, so I'm opening an issue. For some reason my implementation of a video trimming, downscaling, and concatenating script results in a memory leak crashing ffmpeg, Python, and sometimes even my system. See (hopefully all) the relevant code below:
listoftimes
contains the starts and ends of the to-be-trimmed parts in seconds. When ffmpeg didn't crash my system, it wrote an error like "Cannot allocate memory". I will try to get the logs, but here is basic information about the video file: mp4, h264, ~35 minutes, 3 Mbps, 1280x708. I would trim this down to around 12 minutes made of lots of short, hand-selected parts. I was thinking maybe ffmpeg opens the input as many times as the number of parts I choose, but since I'm a beginner with ffmpeg and ffmpeg-python, I don't quite see what I should change in my code to prevent this. I would appreciate some help or ideas on what I might be doing wrong. I will be back with the ffmpeg logs if I can capture them without fully crashing my PC.
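The issue's original code snippet did not survive, so here is a hedged reconstruction of the single-pass workflow it describes, expressed as a raw `-filter_complex` string so the structure is visible without running ffmpeg. `listoftimes` holds `(start, end)` pairs in seconds as stated above; the default 1280x708 scale target and the output labels are assumptions.

```python
# Hypothetical one-pass filtergraph for trim + downscale + concat: each part
# is cut by seconds (trim/atrim), has its timestamps reset (setpts/asetpts),
# and all parts feed a single concat filter.
def build_filtergraph(listoftimes, width=1280, height=708):
    chains = []
    labels = []
    for i, (start, end) in enumerate(listoftimes):
        # Per-part video chain: trim by seconds, reset PTS, downscale.
        chains.append(
            f"[0:v]trim=start={start}:end={end},setpts=PTS-STARTPTS,"
            f"scale={width}:{height}[v{i}]"
        )
        # Per-part audio chain: atrim by seconds, reset PTS.
        chains.append(
            f"[0:a]atrim=start={start}:end={end},asetpts=PTS-STARTPTS[a{i}]"
        )
        labels.append(f"[v{i}][a{i}]")
    n = len(listoftimes)
    chains.append(f"{''.join(labels)}concat=n={n}:v=1:a=1[vout][aout]")
    return ";".join(chains)
```

Note the input is opened once and each part is a separate filter chain off `[0:v]`/`[0:a]`, so the part count grows the filtergraph, not the number of open inputs; the seconds-based `start`/`end` form matches the fix reported in the comments.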