
Retry if a download keeps pending for too long #16

Closed
xiaokangwang opened this issue Jan 5, 2016 · 5 comments · Fixed by #17

@xiaokangwang
Contributor

Is there an option to consider a download failed after it has kept pending for a certain amount of time?

I sometimes encounter the problem that some of the images fail to fetch (probably because the server is down, as they are hosted by other peers).

I have to refresh the page to retry the download, which costs a lot of Image Limits, especially when downloading a gallery that consists of a large number of images.

It seems that this problem is associated with a script error:

Log

Failed to load resource: net::ERR_BLOCKED_BY_CLIENT  // (PURPOSEFULLY)
Failed to load resource: net::ERR_BLOCKED_BY_CLIENT // (PURPOSEFULLY)
Failed to load resource: net::ERR_BLOCKED_BY_CLIENT // (PURPOSEFULLY)
[EHD] E-Hentai Downloader is running.
[EHD] Bugs Report > https://github.com/ccloli/E-Hentai-Downloader/issues | https://greasyfork.org/scripts/10379-e-hentai-downloader/feedback
[EHD] To report a bug, showing all the "[EHD]" logs is wonderful. =w=
[EHD] UserAgent > Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.80 Safari/537.36
[EHD] Script Handler > Tampermonkey
[EHD] GreaseMonkey / Tampermonkey Version > 3.12.58
[EHD] E-Hentai Downloader Version > 1.18.6
[EHD] E-Hentai Downloader Setting > {"thread-count":5,"timeout":30,"number-images":false,"number-real-index":false,"force-resized":false,"never-new-url":false,"never-send-nl":false,"store-in-fs":false}
[EHD] Current URL > http://g.e-hentai.org/g/(^_*)
[EHD] Is Logged In > true
[EHD] Index > 3  | RealIndex > 3  | Name > 02.jpg  | RetryCount > 0  | DownloadedCount > 1  | FetchCount > 5  | FailedCount > 0
(omit)
[EHD] Index > 19  | RealIndex > 19  | Name > 18.jpg  | RetryCount > 0  | DownloadedCount > 22  | FetchCount > 5  | FailedCount > 0
Uncaught TypeError: Cannot read property 'length' of undefined
[EHD] Index > 23  | RealIndex > 23  | Name > 22.jpg  | RetryCount > 0  | DownloadedCount > 23  | FetchCount > 5  | FailedCount > 0
(omit)
[EHD] Index > 29  | RealIndex > 29  | Name > 28.jpg  | RetryCount > 0  | DownloadedCount > 30  | FetchCount > 2  | FailedCount > 0

== Error traceback:

Uncaught TypeError: Cannot read property 'length' of undefined
    fetchThread.(anonymous function).GM_xmlhttpRequest.onload @ VM7682:10108
    (anonymous function) @ VM7676:59

VM7682:10108:
response: new ArrayBuffer(res.responseText.length),

== Debug information

WTF?

JSON.stringify(res) is
{"readyState":4,"responseHeaders":"Date: Tue, 05 Jan 2016 13:43:56 GMT\r\nContent-Length: 0\r\nContent-Type: text/plain; charset=utf-8\r\n","finalUrl":"http://g.e-hentai.org/fullimg.php?(^_*)","status":500,"statusText":"Internal Server Error","responseType":"arraybuffer","response_types":{"response":false,"responseText":false,"responseXML":false}}

so res.response is undefined while

!res.response is true!

res.response==undefined is also true.

!(res.response==undefined) is false. (expected behavior?)
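These results match standard JavaScript semantics: `undefined` is falsy and is loosely equal to itself. A minimal sketch (`res` here is a hand-built stand-in mimicking the debug output above, not the real GM_xmlhttpRequest response object):

```javascript
// Stand-in for the response object from the log: status 500, no body fields set
var res = { status: 500, responseType: 'arraybuffer' };

console.log(res.response);                 // undefined (property was never set)
console.log(!res.response);                // true  (undefined is falsy)
console.log(res.response == undefined);    // true  (loose equality: undefined == undefined)
console.log(!(res.response == undefined)); // false (simply the negation of true)
```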

I will send you a pull request later if I can solve this myself.

@xiaokangwang
Contributor Author

Update: !(res.response==undefined) evaluating to false is in fact expected behavior; maybe !res.response is used to decide whether to read responseText or response.

Reason discovered: my proxy software (goproxy) uses the HTTP proxy protocol, and it responds with a 500 HTTP status code when it encounters a problem (which happens frequently, but not always).

I will send a pull request; however, it is unlikely to affect the majority of users, as long as they are not running the same proxy as me.

@ccloli
Owner

ccloli commented Jan 9, 2016

Wow, it's so strange that both response and responseText are undefined in res, so the function throws the error without ever checking the Content-Type. In fact, moving the Content-Type check to the front can fix the bug, so I will think over your pull request. Anyway, thanks for your contribution.
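The guard being described could be sketched like this (a sketch only, with hypothetical names like `handleResponse` and `onRetry`; it is not the actual patch from the linked pull request):

```javascript
// Hypothetical early guard: check status and body presence *before* touching
// res.responseText.length, which is what threw the TypeError in the log above.
function handleResponse(res, onData, onRetry) {
    if (res.status !== 200 || (res.response == null && res.responseText == null)) {
        onRetry(res); // route bad responses (e.g. the proxy's empty 500) to a retry
        return;
    }
    // Safe to read a body here: at least one of the two fields exists
    onData(res.response != null ? res.response : res.responseText);
}

// The empty 500 response from the debug output is now retried, not crashed on:
handleResponse({ status: 500, responseType: 'arraybuffer' },
    function (data) { console.log('got data'); },
    function (res) { console.log('retrying after HTTP ' + res.status); });
// prints "retrying after HTTP 500"
```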

@ccloli ccloli added the bug label Jan 9, 2016
@xiaokangwang
Contributor Author

In fact, this problem can also occur when fetching a page; my pull request did not address that case.
If you are going to fix this bug, please also consider this possibility.
Edit:typo

ccloli added a commit that referenced this issue Jan 13, 2016
…ect if generates Zip file successfully (see #18)
@sorasoras

Is there an option to retry if a download keeps pending for too long?
I encounter the same problem: some pending downloads prevent the whole download from finishing.

@ccloli
Owner

ccloli commented Nov 12, 2016

@shing3232
If you just want to stop a pending download, please follow these steps and choose whichever of them you like.

  1. Move your mouse over the status text in the progress table; the status text will change to a Force Abort button. Click it to force an abort, and the download will stop and automatically retry until it fails.
  2. Set the timeout to a lower number (the default is 300 seconds) in Settings.
  3. Enable abort downloading when speed is too low in Settings.
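For the original feature request (treating a long-pending download as failed and retrying it automatically), the idea can be sketched as a timeout wrapper. This is an illustration under assumed names only: `download`, `timeoutMs`, and `maxAttempts` are hypothetical, not E-Hentai Downloader options.

```javascript
// Hypothetical sketch: give up on a pending download after timeoutMs and
// retry, up to maxAttempts total attempts, before reporting failure.
function downloadWithRetry(download, url, timeoutMs, maxAttempts, done) {
    var attempt = 0;
    function tryOnce() {
        attempt += 1;
        var finished = false;
        var timer = setTimeout(function () {
            if (finished) return;
            finished = true; // treat the still-pending download as failed
            if (attempt < maxAttempts) tryOnce();
            else done(new Error('download timed out'), null);
        }, timeoutMs);
        download(url, function (err, data) {
            if (finished) return; // late response after an abort: ignore it
            finished = true;
            clearTimeout(timer);
            if (err && attempt < maxAttempts) tryOnce();
            else done(err, data);
        });
    }
    tryOnce();
}
```

A real userscript would also need to abort the underlying GM_xmlhttpRequest when the timer fires, which the sketch omits.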

Anyway, this issue is about a proxy client responding with empty data and a wrong status code, which makes the script go wrong. If your problem looks like that, please show me more details, such as console output. If you have any more questions, or find that your bug is not related to this issue, please open a new issue so that both of us can track the bug-fixing progress easily. Thanks for your understanding. :-)
