
How can I keep downloads for slow speed original URLs? #258

Open
KOBJING opened this issue Oct 30, 2023 · 1 comment
KOBJING commented Oct 30, 2023

When I occasionally encounter an original URL with an incredibly slow download speed, such as only 0 KB/s to 10 KB/s, can I prevent the download from being interrupted with a Network Error?
I have the script set to not abort, but it still disconnects.
The script seems to keep retrying the same original image URL over and over; although the Limits cost is consumed each time, it disconnects again and the download cannot be completed.
Is there any way to keep it from disconnecting no matter how slow the download speed is?
Or can I resume the download from where it left off when it retries?
I would like the download to complete no matter how unstable and time-consuming the connection is...

ccloli (Owner) commented Oct 30, 2023

can I prevent the download from being interrupted with a Network Error?

If you've set the timeout to 0 and haven't enabled "abort at low speed", the script won't disconnect on its own. The disconnection is caused by the client or the server killing the connection, or by a fatal error somewhere in the network chain, and there's nothing the script can do about that.
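Conceptually (this is an illustrative sketch, not the script's actual code), the only conditions under which the script would actively disconnect look like this:

```javascript
// Illustrative sketch of when a downloader would actively abort a
// connection. With timeoutMs = 0 and the low-speed abort disabled
// (minBytesPerSec = 0), this never returns true, so any disconnect
// must come from the client, the server, or the network in between.
function shouldAbort({ timeoutMs, elapsedMs, minBytesPerSec, recentBytesPerSec }) {
  if (timeoutMs > 0 && elapsedMs >= timeoutMs) return true;               // hard timeout hit
  if (minBytesPerSec > 0 && recentBytesPerSec < minBytesPerSec) return true; // speed too low
  return false;                                                            // never abort actively
}
```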

The script seems to keep retrying the same original image URL over and over; although the Limits cost is consumed each time, it disconnects again and the download cannot be completed.

When the script retries, it reuses the redirected URL, which doesn't cost Limits again, unless it keeps failing multiple times (the default is 3).
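The retry behavior described above can be sketched like this (illustrative names, not the script's actual implementation): the redirected URL is cached and reused for free, and only after the failure threshold is reached does the script resolve the original URL again, which is what spends Limits.

```javascript
// Illustrative sketch of the retry logic described above.
const MAX_FAILURES = 3; // default failure threshold mentioned above

function makeRetryState() {
  return { redirectedUrl: null, failures: 0 };
}

// `resolveUrl(originalUrl)` stands in for the request that spends
// Limits; the cached `state.redirectedUrl` is reused at no cost.
function nextAttemptUrl(state, originalUrl, resolveUrl) {
  if (state.redirectedUrl === null || state.failures >= MAX_FAILURES) {
    state.redirectedUrl = resolveUrl(originalUrl); // costs Limits
    state.failures = 0;
  }
  return state.redirectedUrl; // free reuse on ordinary retries
}

function recordFailure(state) {
  state.failures += 1;
}
```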

Is there any way to keep it from disconnecting no matter how slow the download speed is?

Setting up a proxy or VPN may resolve some cases.

Or can I resume the download from where it left off when it retries?

Due to the limitations of the GM API, there's no way to do that, but you can try a CLI or GUI tool that doesn't run in a browser, like gallery-dl, which should handle this case nicely.
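For reference, tools outside the browser resume downloads by sending an HTTP `Range` header for the bytes already on disk, something the GM API doesn't expose in a way that lets a userscript append to a partial file. A minimal sketch of that mechanism (assuming the server supports range requests; names are illustrative):

```javascript
// Illustrative sketch of HTTP-range resumption, the mechanism used by
// tools like gallery-dl, `curl -C -`, or `wget -c`.

// Build the extra request header for resuming after `bytesOnDisk` bytes.
function resumeHeaders(bytesOnDisk) {
  return bytesOnDisk > 0 ? { Range: `bytes=${bytesOnDisk}-` } : {};
}

// 206 Partial Content means the server honored the Range header and the
// new bytes can be appended; a plain 200 means restart from zero.
function shouldAppend(status) {
  return status === 206;
}
```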
