
Split Download Large Galleries #5

Closed
JingJang opened this issue Aug 21, 2015 · 14 comments

Comments

JingJang commented Aug 21, 2015

Some large galleries do not have a torrent, and currently E-Hentai Downloader cannot handle them. It just downloads the images, and then the save dialog never appears.

Please add a way to download large galleries in segments, so you can spread the download over several days if you want or need to because of download limits.

ccloli (Owner) commented Aug 21, 2015

Thanks for your suggestion. This feature was added to the plan when 1.15 was released (the note was in Chinese, so you may have missed it). The bug is mostly caused by not having enough free memory when generating the ZIP file; if you are using Firefox, the browser stops the running script.

ccloli (Owner) commented Aug 21, 2015

By the way, you can use Chrome with Tampermonkey. Chrome may not kill the running script when there is no more free memory (generating the ZIP file can need about twice the archive size, and more if the deflate level is not 0), and it usually generates the ZIP file successfully (but note that Chrome can only handle files up to 500 MiB).

Gummar commented Aug 22, 2015

Are you planning to add a file range selection feature? i.e. download only page 1 and pages 200 ~ 298 because pages 2 ~ 199 are text versions of pages 200 ~ 298.

ccloli (Owner) commented Aug 22, 2015

@Boxmanbr I'm not sure. This function may be added in the future; I have thought about it too, but maybe not right now. In the current version, the way to get the next page's URL is to fetch the previous page (which also yields the image's URL), so this would waste your image viewing limits if you only want some of the images. It would also make the action box overflow the archive info frame (it already overflows on g.e-hentai.org).

iShyboy commented Aug 23, 2015

Oh, so you could add a button to pause the download too. As for the pages, something like what printing dialogs do with files, like 1-20,27,31-39, would be cool, but hard to make. We have this function for batch downloads on IRC, and it's great.

Edit: Oh, yes, you could make the "Download Archive" button become "Pause". I once had a bad experience of clicking "Download Archive" twice, and it wasted bandwidth; I only noticed after the download.
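The "1-20,27,31-39" style page selection described above can be expanded with a small helper. This is only an illustrative sketch in plain JavaScript (`parsePageRanges` is a hypothetical name, not part of E-Hentai Downloader's actual code):

```javascript
// Sketch: expand a range string such as "1-20,27,31-39" into a sorted list of
// unique page numbers. Malformed segments are silently ignored.
function parsePageRanges(input) {
    var pages = {};
    input.split(',').forEach(function (part) {
        var m = part.trim().match(/^(\d+)(?:\s*-\s*(\d+))?$/);
        if (!m) return; // skip segments that are not "N" or "N-M"
        var start = parseInt(m[1], 10);
        var end = m[2] ? parseInt(m[2], 10) : start;
        for (var i = start; i <= end; i++) pages[i] = true;
    });
    return Object.keys(pages).map(Number).sort(function (a, b) { return a - b; });
}

// parsePageRanges('1-3,7,5-6') → [1, 2, 3, 5, 6, 7]
```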

ccloli (Owner) commented Aug 23, 2015

@iShyboy Thanks for your suggestions. Pausing the download is a good idea; I'll try to work it out. Downloading only some images of an archive is a bit harder, but doable. But as I said earlier this morning, the current way to get image URLs would waste the user's image viewing limits, so I need to rewrite it. And there is no space to add an input box (in fact, the action box already overflows on g.e-hentai.org), so I also need to decide where to put it.

ccloli (Owner) commented Aug 23, 2015

@iShyboy I thought about it again; pausing the download is not that important right now, and it would take quite a lot of code, so I'm sorry, it may not be supported in the next few releases.

iShyboy commented Aug 23, 2015

Well, it's not a problem. If you could fix the issue where clicking twice fetches and downloads everything twice, that would already be great.

I have an idea, though I don't know whether it's hard to do; take your time, it's not urgent anyway. E-Hentai Highlighter has an answer to the space issue.


ccloli (Owner) commented Aug 23, 2015

OK, this will be fixed in the next version. Sorry again that I can't satisfy your request.

ccloli (Owner) commented Aug 23, 2015

I also thought about this UI, but it seems that the button would be a little distracting. Anyway, thanks for your advice.

ccloli added a commit that referenced this issue Aug 28, 2015

Allow downloading specified pages of an archive (thanks to @JingJang , @Boxmanbr and @iShyboy , see issue #5 ); you can number images with their original page numbers in settings if you set a page range; moved the action box below the information box; ask for confirmation when "Download Archive" is clicked again (also fixed fetching images repeatedly when re-downloading, thanks to @iShyboy , see issue #5 )
JingJang (Author) commented

I would like to say thank you very much for your work. So if I forget to say it, let this count. 😀

ccloli (Owner) commented Aug 31, 2015

@iShyboy

If you mean pausing a download the way Mega does, I just tested it, and the answer is that there is no way to pause the download.

I found an answer on Stack Overflow: Pause a download with XMLHttpRequest in Javascript. I tried it, but it seems there is no way to access the response data while it is still being received. Here is my test code:

var xhr = new XMLHttpRequest();
xhr.open('GET', '/content.txt'); // content.txt is hosted on the same server
xhr.responseType = 'arraybuffer'; // request binary data; whatever it is set to, the result is the same
xhr.onreadystatechange = function () {
    var readyStateText;
    switch (xhr.readyState) {
        case 0: // request has not been sent yet
            readyStateText = 'UNSENT';
            break;
        case 1: // open() has been called
            readyStateText = 'OPENED';
            break;
        case 2: // connected to the server, response headers received
            readyStateText = 'HEADERS_RECEIVED';
            break;
        case 3: // headers received, response body is being received
            readyStateText = 'LOADING';
            break;
        case 4: // response body fully received
            readyStateText = 'DONE';
            break;
    }
    console.log('readyState:', readyStateText, ', Response:', xhr.response); // log progress to the console
};
xhr.send();

And here is the log:

readyState: HEADERS_RECEIVED , Response: null
readyState: LOADING , Response: null
readyState: LOADING , Response: null
...... // all of these are "readyState: LOADING , Response: null"
readyState: LOADING , Response: null
readyState: LOADING , Response: null
readyState: DONE , Response: ArrayBuffer {}

So as you can see, we can't get the response data while it is being received.

I checked how Mega pauses downloads. Mega splits a file into many parts, and when downloading it fetches some of these parts rather than the whole file, so the content that has already been downloaded is kept. If we pause the download, the parts currently being fetched are thrown away, and when we resume they are fetched again. Images on E-Hentai are not that large, so if we downloaded them this way, most of the downloading time would be wasted on connecting to the server.
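Mega-style chunking boils down to computing byte ranges that can each be fetched independently with an HTTP `Range: bytes=start-end` request header, so finished chunks survive a pause. A minimal sketch (the `splitIntoRanges` helper is hypothetical, not Mega's or the extension's actual code):

```javascript
// Sketch: split a file of `totalSize` bytes into chunks of at most
// `chunkSize` bytes. Each element describes one inclusive byte range that a
// resumable downloader could fetch with a `Range: bytes=start-end` header.
function splitIntoRanges(totalSize, chunkSize) {
    var ranges = [];
    for (var start = 0; start < totalSize; start += chunkSize) {
        var end = Math.min(start + chunkSize, totalSize) - 1; // inclusive end offset
        ranges.push({ start: start, end: end });
    }
    return ranges;
}

// A 1 MiB file in 256 KiB chunks yields 4 ranges:
// [{start: 0, end: 262143}, ..., {start: 786432, end: 1048575}]
```

This also shows why it is a poor fit here: each E-Hentai image is small, so nearly every chunk would be a separate connection to the server.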

iShyboy commented Sep 10, 2015

No need; the alert you added some time ago is already enough.

ccloli (Owner) commented May 23, 2017

Hi, sorry for bumping this issue. I'm considering splitting the archive by file size (not only by page range), e.g. storing each archive part at no more than 100 MB. Though I'm not sure when I'll work on it (yes, it really has been a long time since this issue was opened), there are some open questions about it; I think the core one is: do we need the correct order of images / archives? If you are interested, check out #57 to join the discussion, or subscribe to it to follow the latest discussion. 😸
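Splitting by size could work by greedily grouping files into parts that stay under the limit while preserving their original order. A rough sketch (`splitBySize` is a hypothetical helper, assuming each file object carries its byte size; it is not the extension's implementation):

```javascript
// Sketch: group files (in their original order) into archive parts so that
// each part stays at or under `limit` bytes. A single file larger than the
// limit still gets a part of its own rather than being dropped.
function splitBySize(files, limit) {
    var parts = [[]];
    var current = 0; // running byte total of the part being filled
    files.forEach(function (file) {
        if (current + file.size > limit && parts[parts.length - 1].length > 0) {
            parts.push([]); // start a new part
            current = 0;
        }
        parts[parts.length - 1].push(file);
        current += file.size;
    });
    return parts;
}
```

Keeping the files in order means each part covers a contiguous page range, which is one possible answer to the "correct order" question above.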
