
Pluralsight content too short #9504

Closed
CollinChaffin opened this issue May 15, 2016 · 4 comments

Comments


@CollinChaffin CollinChaffin commented May 15, 2016

The Pluralsight extractor seems to be extracting fine, but a growing number of videos, even when heavily throttled, end up as unfinished ".part" files, i.e. failed downloads. This means the list has to be restarted. Out of 36 videos, I think the last run had 7 failures with this error, and I still haven't retrieved all of the missed ones. Repeatedly re-running eventually gets each file, but re-running 5+ times to pick up the missing ones may increase the risk of drawing attention and a possible ban.

Here's one example of the pertinent error with --verbose. Let me know how I can help test further.

[download] 14.0% of 8.41MiB at 8.40KiB/s ETA 14:41 ERROR: content too short (expected 8816947 bytes and served 1231463)
Traceback (most recent call last):
File "youtube_dl\YoutubeDL.pyo", line 1643, in process_info
File "youtube_dl\YoutubeDL.pyo", line 1585, in dl
File "youtube_dl\downloader\common.pyo", line 350, in download
File "youtube_dl\downloader\http.pyo", line 236, in real_download

ContentTooShortError

Collaborator

@dstftw dstftw commented May 15, 2016

content too short is usually an indication of network issues.

@dstftw dstftw closed this May 15, 2016
Author

@CollinChaffin CollinChaffin commented May 16, 2016

Hi, is there a way yet to do automatic retries on the content too short errors? --retries doesn't seem to cover this case. If it is just network related, I'm okay with retrying, but I cannot find an easy way other than re-processing the entire list and skipping the videos that already succeeded. If there were a way to get --retries XX to literally retry the failed video XX times, that would be awesome!
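In the meantime, a shell-level workaround can be sketched. This is not from the thread: the retry wrapper and file names below are illustrative, but --download-archive is a real youtube-dl option that records finished videos in a file so repeated passes skip them.

```shell
#!/bin/sh
# Sketch of a retry wrapper (hypothetical helper), assuming youtube-dl's
# real --download-archive option so completed videos are skipped each pass.
retry_until_ok() {
    # Run the given command up to 5 times, stopping on the first success.
    for attempt in 1 2 3 4 5; do
        "$@" && return 0
        echo "attempt $attempt failed; retrying..." >&2
        sleep 1
    done
    return 1
}

# Hypothetical usage (URL and archive file name are placeholders):
# retry_until_ok youtube-dl --download-archive done.txt "https://example.com/playlist"
```

Each re-run only touches the videos not yet listed in done.txt, so failed downloads are retried without re-processing the whole list.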

Author

@CollinChaffin CollinChaffin commented May 17, 2016

Hey Sergey, you may want to look at re-opening this one and allowing the community to help collect some additional debug info. I see others posting that Pluralsight has changed something, and a growing portion of their playlists now fail with content too short, yet no one (including me) has any other content too short (or other) network-related issues with youtube-dl or anything else. If I get a bit more time I'll try to get a Wireshark capture, since the youtube-dl verbose output doesn't really offer any hints. Others are reporting that this may be a new way for Pluralsight to flag accounts (even throttled ones) by watching for the playlist being retried multiple times before issuing a ban, but I have not been able to confirm that.

Author

@CollinChaffin CollinChaffin commented May 18, 2016

I have not run any Wireshark captures yet, but I can confirm that this is in fact caused by setting the --rate-limit option in youtube-dl. Using 50k as an example, a large number of videos from a playlist will fail, and the failure is easily reproduced. Removing the rate limit results in 100% success; adding it back immediately causes failures again.
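The reproduction described above amounts to a pair of invocations. The playlist URL below is a placeholder; --rate-limit is a real youtube-dl option that throttles the download speed:

```shell
# Reported to fail intermittently with "content too short":
youtube-dl --rate-limit 50k "https://app.pluralsight.com/library/courses/EXAMPLE"

# Reported to succeed reliably once the limit is removed:
youtube-dl "https://app.pluralsight.com/library/courses/EXAMPLE"
```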
