
how can I download all videos on a page? #3450

Open
ram1123 opened this issue Aug 5, 2014 · 7 comments

ram1123 commented Aug 5, 2014

I need to download all of the videos at the following link.

http://cds.cern.ch/search?ln=en&cc=Videos&p=recid%3A1125472&action_search=Search&op1=a&m1=a&p1=&f1=&c=Videos&c=&sf=&so=d&rm=wrd&rg=100&sc=0&of=hb

How can I download them all?

Thanks in advance.

with regards,
Ramkrishna

rg3 (Collaborator) commented Aug 6, 2014

That page appears to display 100 results. I think you can extract the URLs for the individual videos with a simple grep, save the URL list to a file and use it with youtube-dl and its -a and -w options.

wget -q -O - 'http://cds.cern.ch/search?ln=en&cc=Videos&p=recid%3A1125472&action_search=Search&op1=a&m1=a&p1=&f1=&c=Videos&c=&sf=&so=d&rm=wrd&rg=100&sc=0&of=hb' | grep -o 'http://cds\.cern.ch/record/[0-9]\+' | sort -u >urls.txt
youtube-dl -w -a urls.txt

And then erase urls.txt.
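For reference, a rough Python equivalent of the download step, using youtube-dl's embedded YoutubeDL API; this is only a sketch and assumes the urls.txt file produced by the wget/grep line above:

import youtube_dl

# Read the URL list produced by the wget/grep pipeline above.
with open('urls.txt') as f:
    urls = [line.strip() for line in f if line.strip()]

ydl_opts = {
    'nooverwrites': True,  # same effect as the -w option
    'ignoreerrors': True,  # keep going if one record cannot be downloaded
}

# Download every record URL in one run.
with youtube_dl.YoutubeDL(ydl_opts) as ydl:
    ydl.download(urls)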

ram1123 (Author) commented Aug 6, 2014

Thank you very much. It worked and I learned something new. :D

ram1123 closed this Aug 6, 2014
phihag (Contributor) commented Aug 6, 2014

Reopening: we should add support for cds.cern.ch so that a page like this is natively interpreted as a playlist.

phihag reopened this Aug 6, 2014
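A minimal sketch of what such a playlist extractor could look like inside youtube_dl/extractor/; the class name, _VALID_URL pattern and playlist title here are illustrative assumptions, not an extractor that exists in youtube-dl:

from __future__ import unicode_literals

import re

from .common import InfoExtractor


class CDSCERNSearchIE(InfoExtractor):
    # Hypothetical extractor: matches CDS search-result pages like the one above.
    _VALID_URL = r'https?://cds\.cern\.ch/search\?.+'

    def _real_extract(self, url):
        playlist_id = 'cds-cern-search'
        webpage = self._download_webpage(url, playlist_id)
        # Collect the individual record URLs, mirroring the grep used above.
        record_urls = sorted(set(re.findall(
            r'https?://cds\.cern\.ch/record/\d+', webpage)))
        entries = [self.url_result(record_url) for record_url in record_urls]
        return self.playlist_result(entries, playlist_id, 'CDS CERN search results')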
birkanatici commented Aug 11, 2014

I don't understand.

ram1123 (Author) commented Aug 12, 2014

@phihag This is a great idea... Thanks.

@gamecoderz What @phihag means is that if a CERN page contains more than one video, youtube-dl would treat the page as a playlist and download all of the videos at once.

birkanatici commented Aug 12, 2014

@ram1123 I understand your question, but I don't understand what @rg3 is suggesting. How do I run the command 'youtube-dl -w -a urls.txt'? Is it a Linux command?

pvdl (Contributor) commented Sep 2, 2014

@ram1123,

wget -q -O - 'http://cds.cern.ch/search?ln=en&cc=Videos&p=recid%3A1125472&action_search=Search&op1=a&m1=a&p1=&f1=&c=Videos&c=&sf=&so=d&rm=wrd&rg=100&sc=0&of=hb' | grep -o 'http://cds\.cern.ch/record/[0-9]\+' | sort -u >urls.txt
youtube-dl -w -a urls.txt

are Linux commands.
In fact, this uses the wget, grep, sort and youtube-dl commands in one go.
It downloads the whole web page, extracts the record URLs, sorts them and removes duplicates, saves the list to urls.txt, and then passes that list to youtube-dl via the -a option.

@phihag has reopened the issue because there is no dedicated extractor for http://cds.cern.ch yet.

To download a single video, the URL should look like this example:
http://cds.cern.ch/record/1610170
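For example, a single record of that form could also be downloaded through the embedded Python API used in the sketch above (again only a sketch):

import youtube_dl

# Download one CDS record; 'nooverwrites' corresponds to the -w option.
with youtube_dl.YoutubeDL({'nooverwrites': True}) as ydl:
    ydl.download(['http://cds.cern.ch/record/1610170'])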
