
How can naming and post-processing options be applied when aria2 is used to download videos concurrently? #15268

Closed
MarkKoz opened this issue Jan 16, 2018 · 2 comments

@MarkKoz commented Jan 16, 2018

What is the purpose of your issue?

  • Bug report (encountered problems with youtube-dl)
  • Site support request (request for adding support for a new site)
  • Feature request (request for a new functionality)
  • Question
  • Other

Description

I want to download videos concurrently and then have youtube-dl write metadata to the videos and format their titles.

As described in #350, aria2 can be used to download videos concurrently. However, using --external-downloader downloads only one video at a time (though aria2 can still split that single video into multiple connections). To get true concurrency, aria2 needs the full list of URLs ahead of time. However, youtube-dl's -a can't be combined with aria2's -i, since aria2 would then download the whole list once for every video youtube-dl parses.

Remarks

aria2 has event hooks that may be of use, particularly --on-download-complete. Unfortunately, from my reading of the documentation, it seems to execute only when the entire download completes, not once per video. Even if it did fire once per video, I don't see how youtube-dl could modify an existing video, or how the necessary information (e.g. the original link) could be passed back to it so that metadata could be retrieved for the video.
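For reference, the aria2 docs state that a hook command given to --on-download-complete is invoked with three positional arguments: the download's GID, the number of files, and the path of the first file. A minimal Python sketch of such a hook (the message format here is my own invention) illustrates the mapping problem: the original video URL is not among the arguments.

```python
import sys


def on_download_complete(gid, num_files, first_file_path):
    """Mirror the arguments aria2 passes to an --on-download-complete hook:
    the download's GID, the number of files, and the path of the first file.

    Note that the original video URL is not among them, which is exactly
    the difficulty described above.
    """
    return "done: gid=%s files=%s path=%s" % (gid, num_files, first_file_path)


if __name__ == "__main__":
    # aria2c --on-download-complete=/path/to/this_script.py would land here.
    print(on_download_complete(*sys.argv[1:4]))
```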

I've observed that youtube-dl can detect when a video has already been downloaded; I assume it does this with a simple file-name check. I could download all the videos with aria2 and then run youtube-dl against the list of original links. However, because that check relies on the file names, I unfortunately wouldn't be able to reformat them.

Possible Solutions

My current solution is to write a script that starts a new youtube-dl process with --external-downloader for every line in a file containing URLs. If anyone sees something horribly wrong with this or has a better idea, please do tell.
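A minimal sketch of that script, assuming a urls.txt with one URL per line; the concurrency level and output template are arbitrary choices, and the command is parameterized so the function can be exercised with a stand-in binary:

```python
import shlex
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Default: plain youtube-dl delegating the transfer to aria2c, with
# metadata and title formatting handled by youtube-dl itself.
DEFAULT_CMD = ("youtube-dl --external-downloader aria2c "
               "--add-metadata -o %(title)s.%(ext)s")


def download_all(urls, max_workers=4, command=DEFAULT_CMD):
    """Spawn one downloader process per URL, at most max_workers at once.

    Returns each process's exit code, in input order.
    """
    base = shlex.split(command)

    def run(url):
        return subprocess.run(base + [url]).returncode

    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(run, urls))


if __name__ == "__main__":
    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]
    download_all(urls)
```

Because each URL gets its own youtube-dl process, naming and post-processing options apply per video as usual; the only coordination needed is capping the number of simultaneous processes.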

Alternatively, I could use the simulation options or --write-info-json to dump metadata to files. The videos would then be downloaded by a single aria2 process, and the aforementioned event hook would execute a program that parses the dumped metadata and writes it to the videos. This seems more proper than the first solution, but it's also substantially more work.
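The metadata step of that second approach might be sketched like this, assuming the .info.json files dumped by --write-info-json sit in the same directory as the files aria2 produced, and using a rename to the video's title as a stand-in for full metadata embedding (which would need an external tagging tool):

```python
import json
from pathlib import Path


def apply_titles(directory):
    """For each <id>.info.json dumped by youtube-dl, find the matching
    downloaded file and rename it to the video's title.

    Returns the new file names. Real metadata embedding (tags, etc.)
    would need an external tool and is out of scope for this sketch.
    """
    renamed = []
    for info_path in sorted(Path(directory).glob("*.info.json")):
        info = json.loads(info_path.read_text())
        ext = info.get("ext", "mp4")
        stem = info_path.name[: -len(".info.json")]
        src = Path(directory) / "{}.{}".format(stem, ext)
        if src.exists():
            dst = src.with_name("{}.{}".format(info["title"], ext))
            src.rename(dst)
            renamed.append(dst.name)
    return renamed
```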

@MarkKoz MarkKoz changed the title How Can Naming and Post-processing Options Be Applied when aria2 is Used to Download Videos Concurrently? How can naming and post-processing options be applied when aria2 is used to download videos concurrently? Jan 16, 2018
@dstftw (Collaborator) commented Jan 16, 2018

So what do you expect to hear? Spawning a youtube-dl process per URL is the most obvious way to parallelize downloads, yes. Alternatively, handle everything yourself, using youtube-dl only as a URL resolver and metadata provider.

@dstftw dstftw closed this Jan 16, 2018
@MarkKoz (Author) commented Jan 16, 2018

I expected to perhaps hear something helpful which I missed. Thanks.
