DiscoveryGo while using cookies #11219
Comments
Re-export cookies and try again.
I have re-exported multiple times, and just tried again and received the same errors.
I, too, have verified this issue. I exported my cookies twice.
Just FYI... there are some "unlocked" episodes on DiscoveryGo where the m3u8 can be extracted without a login or cookies: https://www.discoverygo.com/killing-fields/a-body-in-the-bayou/
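For those unlocked pages, a quick-and-dirty way to pull the m3u8 URL is a regex over the page source. This is only a sketch: the inline `page` string below is a made-up stand-in for the real DiscoveryGo markup, which will differ.

```python
import re

# Hypothetical stand-in for an "unlocked" episode page; the real markup
# differs, but the idea is the same: grab the first m3u8 URL in the HTML.
page = '<div data-video=\'{"src": "https://cdn.example.com/master.m3u8"}\'></div>'

match = re.search(r'https?://[^"\']+\.m3u8', page)
print(match.group(0))  # -> https://cdn.example.com/master.m3u8
```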
Inside the data-video JSON, there is unescaped XML under the "adParameters" key that makes the JSON parser choke. The XML root appears to be consistently "VAST", and if you peel it out of the JSON string, things suddenly work (for me). The hack I have:
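As a rough illustration of that kind of workaround (this is not KyleBS's actual patch; the sample JSON and the regex are invented for the demo), one can blank out the embedded VAST payload before handing the string to the JSON parser:

```python
import json
import re

# Made-up data-video JSON: the "adParameters" value embeds raw VAST XML
# whose quotes would break a naive json.loads().
data = '{"name": "demo", "adParameters": "<VAST version="3.0"></VAST>"}'

# Peel the <VAST>...</VAST> payload out of the string before parsing.
cleaned = re.sub(r'"adParameters":\s*"<VAST.*?</VAST>"',
                 '"adParameters": null', data)

video = json.loads(cleaned)
print(video['name'])  # -> demo
```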
Thanks, @KyleBS. Would you mind posting a copy of your Discoverygo.py file? Unfortunately, I'm on Windows and fairly new to this whole "git" thing, so I haven't been able to integrate your changes successfully.
Posted as .txt to appease GitHub; be sure to change the extension back to .py!
Could anyone paste a dump of the breaking XML string?
Thanks @KyleBS. It still isn't working for me on Windows. Oh well, I'll just wait until it's fixed. Until then, it's easy enough to get the m3u8 from the page and download it.
Is that something I can help you with? Is there a command line I should run?
Add

Hopefully this gives you what you asked for.
Does it work with this change?

```diff
diff --git a/youtube_dl/extractor/discoverygo.py b/youtube_dl/extractor/discoverygo.py
index c4e83b2c3..b4d686a09 100644
--- a/youtube_dl/extractor/discoverygo.py
+++ b/youtube_dl/extractor/discoverygo.py
@@ -49,7 +49,7 @@ class DiscoveryGoIE(InfoExtractor):
                 webpage, 'video container'))
         video = self._parse_json(
-            unescapeHTML(container.get('data-video') or container.get('data-json')),
+            container.get('data-video') or container.get('data-json'),
             display_id)
         title = video['name']
```
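For context on why dropping unescapeHTML helps: the attribute's JSON is already parseable as extracted, but HTML-unescaping it first turns entities inside the embedded VAST XML into raw quotes, which corrupts the JSON. A self-contained demonstration (the sample string is invented for illustration):

```python
import html
import json

# Invented stand-in for a data-video attribute value: valid JSON whose
# "adParameters" value embeds XML carrying HTML entities.
raw = '{"name": "A Body in the Bayou", "adParameters": "<VAST version=&quot;3.0&quot;></VAST>"}'

# Parsing the raw attribute text works fine.
video = json.loads(raw)
print(video['name'])  # -> A Body in the Bayou

# Unescaping first turns &quot; into literal quotes *inside* a JSON
# string, which is exactly what made the parser choke.
try:
    json.loads(html.unescape(raw))
except json.JSONDecodeError:
    print('parse failed after unescaping')  # -> parse failed after unescaping
```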
Yes... that works!!! Thank you!
Thanks @StevenDTX. Which video(s) were you testing against? Also, I'd like to hear more feedback before committing it. @KyleBS, does that work for you?
I downloaded a couple of episodes of Gold Rush and several of Alaska: The Last Frontier. It also tested fine on ScienceChannelgo.com and on AnimalPlanetgo.com.
@yan12125 Can confirm, this solution worked for me as well. Nice catch!
I remember there were some errors with the "unlocked" videos mentioned in #11219 (comment). However, they are now all subscriber-only, so I can't reproduce the error. Closing this for now; feel free to open a new issue if any problem comes up.
Thanks, @yan12125 |
Please follow the guide below
- Put an x into all the boxes [ ] relevant to your issue (like that [x])
- Make sure you are using the latest version: run youtube-dl --version and ensure your version is 2016.11.18. If it's not, read this FAQ entry and update. Issues with an outdated version will be rejected.
- Before submitting an issue make sure you have:

What is the purpose of your issue?

The following sections concretize particular purposes of issues; you can erase any section (the contents between triple ---) not applicable to your issue.

If the purpose of this issue is a bug report, a site support request, or you are not completely sure, provide the full verbose output as follows: add the -v flag to the command line you run youtube-dl with, copy the whole output, and insert it here between triple ``` (it should look similar to the example below). If the purpose of this issue is a site support request, please provide all kinds of example URLs for which support should be included (replace the example URLs with yours).

Description of your issue, suggested solution and other information
I had been using this method without issue, but it recently stopped working. I am in the US and have no issues watching the video at the provided link.