youtube-dl's URL SOMETIMES returns 404 error when downloaded with FFmpeg #17903
Comments
Also, it won't return a direct download URL in cases when no such URL is available.

Makes sense, but why would this happen only sometimes, though? I think it's only happening to DASH files.

It may happen when the format you've selected happens to be segmented DASH.
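A quick way to check whether the format youtube-dl has selected is segmented DASH is to inspect the `protocol` field of the extracted info dict. The sketch below is illustrative only: the placeholder URL is made up, and `http_dash_segments` is, to my understanding, the label youtube-dl uses for segmented DASH formats.

```python
import youtube_dl

# Placeholder URL -- substitute one of the videos that fails for you.
video_url = 'https://www.youtube.com/watch?v=EXAMPLE'

with youtube_dl.YoutubeDL({'quiet': True}) as ydl:
    info = ydl.extract_info(video_url, download=False)

# After format selection, the chosen format's fields are merged into the
# top-level dict, so 'protocol' (when present) tells you how that URL
# would be downloaded; plain progressive formats may omit it.
print(info.get('format'), info.get('protocol'))
if info.get('protocol') == 'http_dash_segments':
    print('Selected format is segmented DASH; its URL is not a single plain file.')
```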
Make sure you are using the latest version: run `youtube-dl --version` and ensure your version is 2018.10.05. If it's not, read this FAQ entry and update. Issues with outdated version will be rejected.

- [x] I've verified and I assure that I'm running youtube-dl 2018.10.05

Before submitting an issue make sure you have:

- [x] At least skimmed through the README, most notably the FAQ and BUGS sections
- [x] Searched the bugtracker for similar issues including closed ones
- [x] Checked that provided video/audio/playlist URLs (if any) are alive and playable in a browser

What is the purpose of your issue?

- [x] Question

Youtube-dl works OK. What I'm doing is using the URL provided by youtube-dl; it is fed to ffmpeg, which spits out an error. I wouldn't have posted this here, but I don't know where else.
Here's my workflow:
```python
with ydl:
    info_dict = ydl.extract_info(youtubevideo, download=False)
    url = info_dict.get('url')
    duration = info_dict.get('duration')
```

The URL is then passed to ffmpeg (via ffmpy), which fails with:

```
[https @ 000001f5c1f3cc40] HTTP error 404 Not Found
VERY_LONG_VIDEO_URL: Server returned 404 Not Found
ffmpy.FFRuntimeError: `ffmpeg -y -i VERY_LONG_VIDEO_URL -ss 7.26 -vframes 1 -q:v 2 x.jpg` exited with status 1
```

The 404 error happens only for some videos, and I wasn't able to pinpoint the exact variable that causes it. At first I thought it was some sort of IP-based limit, but the error is thrown even after long breaks. I've checked; the issue happens with multiple different files. I've tried fetching the URL again, and fetching it again through proxies and retrying. It's very frustrating and sneaky since it happens very rarely.
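A possible workaround (a sketch, not something confirmed in this thread): restrict youtube-dl's format selector so it only picks non-DASH formats, then hand the resulting single-file URL to ffmpeg directly. The `[protocol!*=dash]` filter and the plain `subprocess` call are my assumptions; the video also has to offer a progressive format for this to work at all.

```python
import subprocess

import youtube_dl

video_url = 'https://www.youtube.com/watch?v=EXAMPLE'  # placeholder

# '[protocol!*=dash]' keeps only formats whose download protocol does not
# contain "dash", so the returned 'url' should point at a single
# progressive file rather than a segmented DASH manifest. (Assumption:
# the video offers such a format; otherwise youtube-dl reports that no
# formats match.)
ydl_opts = {'format': 'best[protocol!*=dash]', 'quiet': True}

with youtube_dl.YoutubeDL(ydl_opts) as ydl:
    info = ydl.extract_info(video_url, download=False)

# Same thumbnail-extraction command as in the report above, minus ffmpy.
subprocess.run(
    ['ffmpeg', '-y', '-i', info['url'], '-ss', '7.26',
     '-vframes', '1', '-q:v', '2', 'x.jpg'],
    check=True,
)
```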