Output to stdout #190
Comments
|
The big issue is that aria2 can download multiple files simultaneously, and writes pieces out of order relative to their original file positions. Under those conditions, output to stdout is hard to use.
|
OK, so it sounds like it would be pretty difficult and possibly in conflict with typical use of the tool. A couple of other things I noticed that would make it difficult to implement:
I tried a named pipe (FIFO) approach, but the issue is that aria2 has no way of writing to a file descriptor that is known before the command is executed. Maybe this isn't the right fit for this particular tool.
|
I am interested in it. Is it possible to determine, from outside aria2, how much of the "head" of the file has been downloaded? Is it possible to determine this from aria2's control file?
|
@AnselmD Possibly, but I think that if this feature were implemented, it should be implemented internally rather than externally.
|
I would like to chime in with another 1c for the ability to specify the output file name. ATM, I see no way to download from multiple URLs into differently named files. Use case: use by git-annex; see http://git-annex.branchable.com/todo/extensible_addurl/#comment-40a1d58630f56dd744d56dc56a68770e
|
To specify a different file name for each download, one currently has to use the -i file option.
|
On Fri, 05 Dec 2014, Tatsuhiro Tsujikawa wrote:
sorry if I am blind, but I don't see an option for specifying the output
|
For example, suppose you create the following file and save it as list.txt. Then
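The file contents and the follow-up command were lost in the quote above. As a hedged sketch (the URLs and file names here are made up), aria2's `-i` input-file format lets each URL line be followed by indented per-download option lines such as `out=`:

```
http://example.org/file1.zip
  out=first.zip
http://example.org/file2.zip
  out=second.zip
```

Running `aria2c -i list.txt` on such a file downloads each URL into its own named file.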
|
That's the major showstopper for me as well. I think it would be rather easy to determine internally whether we're currently downloading a single file or multiple files, and to enable "--output2stdout" (or whatever you might call it) only for single-file downloads. There are already options which behave differently for single- and multi-file downloads (--download-result, for example).
|
I needed this as well because we were pulling 500 GB, up to 1 TB, snapshot files from cloud storage to a machine which did not have double the physical space. Resumability is not required: if the pipe breaks, the whole operation is broken, though that does not mean the underlying process can't keep retrying individual parts (and it does). I threw together https://github.com/icodeforlove/npm_fatpipe. It is only for piping output, so if you need resumable features I'd stick with aria2. As for how the concurrency is handled: it downloads multiple parts and pipes them to stdout in order. One of the big issues you will run into is handling back pressure, so the ability to fine-tune this is very useful. Maybe something like this can be added to aria2c; if not, you can use fatpipe for this edge case.
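The scheme described above (fetch parts concurrently, emit to stdout strictly in order, with back pressure on the fetchers) can be sketched independently of fatpipe or aria2. This is a minimal illustration, not either tool's actual implementation: `fetch` is a stand-in for an HTTP range request, and the `window` parameter is the back-pressure knob, bounding how many chunks may be in flight or buffered at once.

```python
from collections import deque
from concurrent.futures import ThreadPoolExecutor

def pipe_in_order(fetch, num_chunks, write, window=4):
    """Fetch chunks concurrently, but pass them to `write` strictly in order.

    At most `window` chunks are in flight or buffered at any time, so a
    slow consumer naturally stops new fetches from starting (back pressure).
    """
    with ThreadPoolExecutor(max_workers=window) as pool:
        pending = deque()        # outstanding futures, kept in chunk order
        next_to_submit = 0
        while next_to_submit < num_chunks or pending:
            # Keep the pipeline full, up to `window` outstanding chunks.
            while len(pending) < window and next_to_submit < num_chunks:
                pending.append(pool.submit(fetch, next_to_submit))
                next_to_submit += 1
            # Emit the oldest chunk; later chunks may already be finished
            # and simply wait in `pending` until their turn.
            write(pending.popleft().result())

# Simulated "download": in a real downloader, fetch(i) would issue an
# HTTP range request for part i.
out = []
pipe_in_order(lambda i: f"chunk{i};".encode(), 8, out.append, window=3)
print(b"".join(out).decode())  # -> chunk0;chunk1;chunk2;chunk3;chunk4;chunk5;chunk6;chunk7;
```

In a real tool, `write` would be `sys.stdout.buffer.write`, and tuning `window` trades memory against how well a slow part is hidden behind the others.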
|
I want to regularly restore compressed database backups. A drop-in replacement for the
|
Do you plan to implement this feature so that we can pipe the output to HTML parsers such as pup?
|
Just adding another voice here; I'd like to be able to specify arbitrary streams for my downloads, especially when talking to the daemonized aria2 client. As a quick trial of attempting to set it to write directly to
So the first order of business here is probably a stream-friendly output mode which avoids doing these things. |
aria2 is awesome! However, one thing I am missing is output to stdout, mainly so things like this can work:
I am comfortable with C++, so I have no problem making a pull request to add it. But I wanted to gauge how involved this feature might be. It seems simple, but it might be more complex than I imagine, since all visible output would then have to go to stderr or directly to the tty. So I figured I'd ask about it first.