youtube-dl autorestart on "content too short" exception #809

Closed
ghost opened this issue Apr 28, 2013 · 25 comments

Comments

@ghost ghost commented Apr 28, 2013

Can I somehow make the downloader restart automatically when errors occur?
Or could you add this feature to the program?
Because, you know, seeing "content too short" or "did not receive any data of block" errors at every single percent of a download, and then manually restarting the program hundreds of times for a single file, is boooooring, really boring.
Thank you for your support and for making this program. :)

@szunyi szunyi commented Apr 29, 2013

I sometimes get this error from YouTube too,
usually when the download speed is very slow.

@ghost ghost commented May 5, 2013

Actually, it doesn't matter what speed you have.
I've gotten this error even at 20-24 Mbit/s.

@szunyi szunyi commented May 6, 2013

If that's true, a "retry" parameter would be good,
for example -retry 5.

Then, if we get a "content too short" error, the program retries (resumes) the download.
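
Something like the existing -R/--retries flag, e.g.:

youtube-dl -R 5 http://example.com

but extended so that it also covers the "content too short" case.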

@szunyi szunyi commented May 8, 2013

I still get this error in the latest version :((

@RonJohn2 RonJohn2 commented May 23, 2013

Does the retry counter reset after every "partial success", or is it an "absolute" number?

By this, I mean: even a partial ("content too short") download should not decrement the --retries counter if some progress is made.

Or have a separate retry counter for "content too short" errors.

@RonJohn2 RonJohn2 commented May 24, 2013

Does "--retry" work with "content too short" errors? With "-R 10" it did not completely d/l an flv last night.

@phihag phihag commented May 24, 2013

Does "--retry" work with "content too short" errors? With "-R 10"
it did not completely d/l an flv last night.

No, that's the point of this issue, to make it work with these errors.

@pushcx pushcx commented Dec 14, 2015

As a workaround, I run youtube-dl with:

until youtube-dl --options http://example.com; do sleep 5; done

This re-runs youtube-dl until it succeeds in downloading all the videos. The sleep is there for two reasons: if something else causes youtube-dl to exit immediately, we don't want to hammer the remote server; and without it you couldn't hit Ctrl-C a few times to break out of the loop.
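
If you'd rather cap the number of attempts than loop forever, a variant of the same idea (just a sketch; the URL and the cap of 10 are placeholders):

# Stop after 10 failed attempts instead of retrying forever.
tries=0
until youtube-dl http://example.com || [ "$tries" -ge 10 ]; do
    tries=$((tries + 1))
    sleep 5
done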

@koenvanderdrift koenvanderdrift commented Apr 18, 2016

That gives me: youtube-dl: error: no such option: --options

@pvasilev pvasilev commented Apr 19, 2016

Isn't it possible for youtube-dl to recognize the error and quit with a specific exit code, so that wrappers know what to do? It currently quits with exit code 1 for "content too short", when it cannot encode the final movie for some reason, and, for example, when interrupted with Ctrl-C. I haven't tried to simulate other errors, so I don't know whether there are designated exit codes for them. Having different exit codes for different errors would make the wrappers' task a lot easier. Thanks.
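
Until that exists, a wrapper has to scrape the error text itself. A rough sketch that retries only on "content too short" (the URL is a placeholder; the match is against the error string youtube-dl prints to stderr, as shown later in this thread):

# Capture stderr so we can match on the message; normal progress
# output is discarded in this sketch.
url=http://example.com
while true; do
    err=$(youtube-dl "$url" 2>&1 >/dev/null) && break
    case $err in
        *"content too short"*) sleep 5 ;;        # resumable: retry
        *) printf '%s\n' "$err" >&2; exit 1 ;;   # anything else: give up
    esac
done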

@yan12125 yan12125 commented Apr 19, 2016

Isn't it possible for youtube-dl to recognize the error and quit with a specific exit code, so that wrappers know what to do?

Requested in #882.

@pvasilev pvasilev commented Apr 19, 2016

Machine-readable errors (#2913) is also somewhat related.

@LeviSchuck LeviSchuck commented May 8, 2016

This is especially troubling for playlists. It happens nearly every time for me in the middle of a 200-video (the max count) playlist. My while true; do ... ; done loop is slowly getting there, but each failure means I hit YouTube again for every one of the preceding videos.

@RonJohn2 RonJohn2 commented May 8, 2016

youtube-dl uses .part files for incomplete files. Can you use their non-existence (or the existence of the .mp4 files) to skip over successfully downloaded files?

@LeviSchuck LeviSchuck commented May 8, 2016

@RonJohn2 I'm trying out -i to skip errors, and then I'll have it loop and retry all the errored, incomplete videos over time. Hopefully this will lessen the ridiculous burden.
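
One could also add --download-archive (an existing youtube-dl option that records the IDs of completed videos in a file) so that re-runs skip everything already fetched. A sketch, with the playlist URL as a placeholder:

# -i skips broken videos instead of aborting the whole playlist;
# --download-archive logs finished IDs so re-runs don't hit them again.
until youtube-dl -i --download-archive archive.txt \
    "https://www.youtube.com/playlist?list=PLxxxxxxxx"; do
    sleep 5
done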

@anon54 anon54 commented Aug 28, 2016

@LeviSchuck Do you have any syntax available that you have created for this?

@LeviSchuck LeviSchuck commented Aug 29, 2016

@anon54 I believe it can be replicated by running youtube-dl on a big playlist like https://www.youtube.com/playlist?list=PL3BDB159758919C4D

@fazzolini fazzolini commented Oct 13, 2016

@koenvanderdrift I think that in that script --options stands for whatever options you use with youtube-dl. You can omit the --options part entirely and the workaround works nicely.
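
For example, with no extra options at all:

until youtube-dl http://example.com; do sleep 5; done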

@pjobson pjobson commented Jan 10, 2017

The bug(?) still exists; you can reproduce it all day long on Dailymotion with the newest version (2017.01.08).

I made a batch file to get these two videos; Dailymotion throttles to about 70 KiB/s. They work well as examples, as they are fairly long (17 min) and will most likely fail.

youtube-dl http://www.dailymotion.com/video/x52jnsp_anthony-bourdain-a-cook-s-tour-s01e13-the-cook-who-came-in-from-the-cold_tv
youtube-dl http://www.dailymotion.com/video/x52oiwj_anthony-bourdain-a-cook-s-tour-s01e16-puebla-where-good-cooks-come-from_tv

Now and then the download stalls and kicks back something like the following. Note that the ERROR does not have a line break in front of it; this is exactly how it appears on screen.

[download]  84.6% of 58.71MiB at 59.80KiB/s ETA 02:34ERROR: content too short (expected 61559462 bytes and served 52095768)

My workaround is a bit convoluted but eventually gets the result I want: I just re-run the batch file over and over until all the episodes are complete, now and then removing completed entries from the batch file (see the sketch after the sample output below).

[download] whatever-xxxxxxx.mp4 has already been downloaded
[download] 100% of 58.24MiB
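
Automating that (a sketch; urls.txt stands in for my batch file):

# -a reads URLs from the batch file; --continue resumes partial .part
# files. Entries that are already fully downloaded are skipped with the
# "has already been downloaded" message, so nothing needs removing by hand.
until youtube-dl --continue -a urls.txt; do
    sleep 30
done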

@vxbinaca vxbinaca commented May 24, 2017

Auto-retries might be nice, but couldn't we just run the command with --continue and then hit Up and Enter to re-run it? It will pick up where it snapped off.

@pushcx pushcx commented May 24, 2017

Yes. This issue is not a bug report that manual retrying is broken; it's a feature request for auto-retry, and a discussion of workarounds for when manual retrying is basically unworkable. We've had unreliable networks or overloaded servers that mean hundreds of disconnects in a single download. That's a long, tedious process to babysit manually.

@Joshfindit Joshfindit commented Jun 24, 2017

Seems like a decent approach would be building in support for --retry --continue

This should deal with

  • Server errors
  • Client errors
  • Poor connection (packet loss, intermittent connection, and/or slow speed)

One edge case I would add (see the sketch after this list):

  • Server throttles down to nothing but does not drop the connection.
    In this case, a method would be needed to either watch that bytes are still being transferred and retry if not, or use a simple timer (if more than N seconds pass with no data, assume a timeout and retry).
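
A rough sketch of that watchdog idea, using youtube-dl's --socket-timeout option so a stalled connection errors out instead of hanging forever (the URL is a placeholder and 30 s is an arbitrary choice):

# If the server sends nothing for 30 s, the read errors out and the loop
# retries; --continue resumes the partial file instead of starting over.
until youtube-dl --continue --socket-timeout 30 http://example.com; do
    sleep 5
done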

@AntonKrug AntonKrug commented Aug 22, 2017

For example, on the Day 5 video (talking about the RISC-V hartid) of this list:

https://www.youtube.com/watch?v=YDjYsYDBtXs&list=PLExHkW81k6eboVT3nkOvmSF6LLWFWwWdv

I get this error pretty often; wouldn't a retry by default be more desirable? On a slightly similar topic, should an error like this stop the download of the whole list? Wouldn't it be better to continue with the others in the list? That might depend on the type of error: if there is no space on the HDD there is no point, but if it's just a server error for that one video, and not the whole of YouTube being down, then continuing would be better.

@pjobson pjobson commented Aug 22, 2017

I've found this kind of thing happens when I'm opening many connections to YouTube or have batched a bunch of downloads: after fetching a bunch of things, it starts to fail. If I run a batch, get some failures, and try again a couple of hours later, I don't get the error any more. I wonder if it is a rate-limiting "feature" on YouTube's side to throttle automated requests.

@AntonKrug AntonKrug commented Aug 23, 2017

This happens for me after only 5. Probably the infrastructure just isn't perfect. When I visited on an open day, the Google folks were saying that they are not aiming for 100% perfection because the cost would be too high. Could that be the case here?

In any case, some retry behavior (maybe a 30-second wait and 2-3 retries by default), and continuing through the list in case of a server error, would be welcome (if it's a local error like no disk space, there is no point in continuing).

If a list has a faulty video and you don't own the list, but you are still interested in getting as much as possible from it, you don't want to stop in the middle of the list. So I think continuing by default should be preferred.
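
(For the continue-through-the-list part, youtube-dl's existing -i/--ignore-errors flag, mentioned above, already does this, e.g. youtube-dl -i PLAYLIST_URL.)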

@dstftw dstftw closed this in a3c3a1e Aug 26, 2017