How can naming and post-processing options be applied when aria2 is used to download videos concurrently? #15268
Comments

So what do you expect to hear? Spawning a youtube-dl process per URL is the most obvious way to parallelize downloads, yes. Alternatively, you handle everything yourself, using youtube-dl only as a URL resolver and metadata provider.

I expected to perhaps hear something helpful which I missed. Thanks.
What is the purpose of your issue?
Description
I want to download videos concurrently and then have youtube-dl write metadata to the videos and format their titles.
As described in #350, aria2 can be used to concurrently download videos. However, using --external-downloader will result in only one video downloading at a time (though aria2 can still split that single video). To fix this, the list of URLs needs to be provided ahead of time. However, youtube-dl's -a can't be combined with aria2's -i, as that would result in aria2 downloading the whole list for each video youtube-dl parses.

Remarks
aria2 has event hooks that may be of use, particularly
--on-download-complete. Unfortunately, from my understanding of the documentation, it seems to execute only when the entire download completes, not once per video. Even if it did fire once per video, I don't see how youtube-dl could modify an existing video, or how the necessary information (e.g. the original link) could be passed back to it so that metadata could be retrieved for the video.

I've observed that youtube-dl can detect when a video has already been downloaded. I assume it does this with a simple file-name check. I could download all the videos with aria2 and then run youtube-dl, passing the list of the original links. However, because of that reliance on file names, I unfortunately wouldn't be able to format them.
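For reference, aria2's documentation describes --on-download-complete as being invoked once per completed download (i.e. per GID, so once per entry of an -i input list), with three arguments: the GID, the number of files, and the path of the first file. A minimal sketch of such a handler; the log-file name and script path are illustrative:

```shell
#!/bin/sh
# Sketch of an aria2 --on-download-complete handler. Per the aria2 docs,
# the hook command receives three arguments for each finished download:
#   $1 = GID, $2 = number of files, $3 = path of the first file.
on_complete() {
    gid="$1"; nfiles="$2"; path="$3"
    # Just log the completed file here; a real handler could match the
    # path against previously dumped metadata and post-process the video.
    printf '%s completed (%s file(s)): %s\n' "$gid" "$nfiles" "$path" \
        >> "${HOOK_LOG:-completed.log}"
}

# When aria2 invokes this script the arguments are present.
if [ "$#" -ge 3 ]; then
    on_complete "$@"
fi

# aria2 would be started with something like (not run here):
#   aria2c -i direct.txt --on-download-complete=/path/to/this-script.sh
```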
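The "provide the list ahead of time" idea from the Description can be sketched with youtube-dl's -g/--get-url option, which prints a video's direct media URL without downloading it. The file names and the YTDL override are illustrative, and formats that youtube-dl would merge with ffmpeg (separate audio/video URLs) would need extra handling:

```shell
#!/bin/sh
# Build an aria2 input file by resolving each page URL with youtube-dl -g.
# YTDL can be overridden (e.g. for testing); defaults to youtube-dl.
resolve_urls() {
    src="$1"    # file with one video page URL per line
    out="$2"    # aria2 -i input file to create
    : > "$out"
    while IFS= read -r url; do
        # -g prints the direct media URL(s) without downloading
        "${YTDL:-youtube-dl}" -g "$url" >> "$out"
    done < "$src"
}

# Usage (not run here):
#   resolve_urls pages.txt direct.txt
#   aria2c -i direct.txt -j 4
```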
Possible Solutions
My current solution is to write a script to start a new youtube-dl process with
--external-downloader for every line in a file containing URLs. If anyone sees something horribly wrong with this or has any better ideas, please do tell.

Alternatively, I could use the simulation options or --write-info-json to dump metadata to files. The videos would then be downloaded by a single aria2 process, and the aforementioned event hook would execute a program I would write to parse the dumped metadata and write it to the videos. This seems more proper than the first solution, but it's also substantially more work.
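The per-URL spawning script can be sketched with xargs: -P runs up to that many youtube-dl processes at once, so each video keeps youtube-dl's naming and metadata handling while aria2 accelerates its individual download. The file name, job count, and YTDL override are illustrative (note that -P is a GNU/BSD xargs extension, not POSIX):

```shell
#!/bin/sh
# Spawn one youtube-dl process per URL, up to 4 at a time.
# Each process uses aria2 only as the downloader for its single video,
# so youtube-dl still applies naming and post-processing per video.
parallel_dl() {
    urls="$1"   # file with one video page URL per line
    xargs -n 1 -P 4 "${YTDL:-youtube-dl}" --external-downloader aria2c \
        < "$urls"
}

# Usage (not run here):
#   parallel_dl urls.txt
```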