This repository has been archived by the owner on Mar 9, 2021. It is now read-only.

Option to grab the url list only #42

Closed
tekko opened this issue Mar 17, 2017 · 2 comments

Comments

@tekko

tekko commented Mar 17, 2017

While the new _files.tumblr feature is good, it would be better to have an option to get only those links.

If only the URLs are saved, rather than the actual files, the lists can be merged outside the program. Once merged, duplicate links can be removed and the remainder batch-downloaded with a browser extension, saving disk space and bandwidth.
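The merge-and-dedupe step described above could be sketched like this (the filenames are hypothetical examples; the actual list names depend on how the tool would export them):

```python
from pathlib import Path

def merge_url_lists(list_files, output_file):
    """Concatenate several URL list files, dropping duplicate
    lines while preserving first-seen order."""
    seen = set()
    merged = []
    for path in list_files:
        for line in Path(path).read_text().splitlines():
            url = line.strip()
            if url and url not in seen:
                seen.add(url)
                merged.append(url)
    # Write one URL per line, ready for any batch downloader.
    Path(output_file).write_text("\n".join(merged) + "\n")
    return merged
```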

Just an idea. I tried something similar by choosing to download only the meta files, but those URLs are blog-specific, if I recall correctly.

Thanks.

@Taranchuk

Taranchuk commented Mar 17, 2017

I like this idea. It could be introduced as an additional download mode that writes only the URL list to a separate text file. This would also allow, for example, comparing the number of URLs against the number of downloaded files; if any files are missing, the already-downloaded URLs can be weeded out and the remaining ones fetched with any downloader.
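The comparison described above could be sketched as follows. It assumes downloaded files keep the last path segment of their URL as the filename, which is an assumption about the downloader's naming scheme, not something the tool guarantees:

```python
from pathlib import Path
from urllib.parse import urlparse

def missing_urls(url_list_file, download_dir):
    """Return the URLs from a saved list whose target file is not
    present in download_dir, so only those need to be re-fetched."""
    downloaded = {p.name for p in Path(download_dir).iterdir() if p.is_file()}
    missing = []
    for line in Path(url_list_file).read_text().splitlines():
        url = line.strip()
        if not url:
            continue
        # Assumed naming scheme: the file is named after the
        # last segment of the URL path.
        name = urlparse(url).path.rsplit("/", 1)[-1]
        if name not in downloaded:
            missing.append(url)
    return missing
```

The resulting list can be written back out and handed to any downloader to fetch only the files that are still missing.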

@johanneszab
Owner

Should be possible.

I'm abstracting things right now, and outputting everything that was grabbed shouldn't be too hard to do. I'll add that.

Thanks for suggesting!

johanneszab added a commit that referenced this issue Mar 23, 2017
- Adds an option to download an url list instead of the binary files themselves (#42).