
Implement Net::HTTP to resolve rate limiting #280

Open
wants to merge 3 commits into base: master

Conversation


@ShiftaDeband ShiftaDeband commented Feb 8, 2024

This is all based on #267 (comment) and @ee3e's work.

This resolves all rate limiting issues without the need for any delays/sleeps.

I am not sure that the http.finish() line in get_raw_list_from_api is in the correct place, so any code review would be helpful.
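On the `http.finish` placement question: Ruby's `Net::HTTP` also offers a block form of `.start` that finishes the session automatically when the block exits, which sidesteps the need to place an explicit `http.finish` at all. A minimal sketch (the host and path here are illustrative, not the PR's actual code):

```ruby
require 'net/http'

# Hypothetical sketch: with the block form, Net::HTTP calls #finish for us
# when the block returns (or raises), so no manual cleanup is needed.
def fetch_status(host, path)
  Net::HTTP.start(host, 443, use_ssl: true) do |http|
    http.get(path).code   # the session stays open for every call inside the block
  end                     # session is closed here automatically
end
```

With an explicit `http.start`, by contrast, the matching `http.finish` should sit in an `ensure` clause after the last request that uses the session.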

Regardless, I thought I'd submit this to try to resolve several of the issues that have come up lately.

All credit should go to @ee3e for their solution. It helped me download a ridiculously large backup (452,831 files) without issue.

(Issues) Resolves #277, resolves #275, resolves #273, resolves #269, resolves #267

(Pull requests) Resolves #268, resolves #266, resolves #262 (at least according to comments)

@bitdruid

Awesome, working fine! But I'm interested in why using Net::HTTP overcomes the rate limiting. Do you have any idea what the initial problem was?

@ShiftaDeband
Author

> Awesome, working fine! But I'm interested in why using Net::HTTP overcomes the rate limiting. Do you have any idea what the initial problem was?

Essentially, we're using the same persistent HTTP session to download everything (both the snapshot list and the pages), keeping it open until the download completes, rather than opening and closing many short-lived connections, which the Wayback Machine rate-limits (even if you're using a legitimate browser!).
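The pattern described above can be sketched roughly as follows. This is a hedged illustration, not the PR's exact code: the host, the helper name `download_over_one_session`, and the paths are assumptions for the example.

```ruby
require 'net/http'

# Hypothetical sketch: fetch many paths over ONE persistent Net::HTTP
# session, opening the connection once and closing it only after every
# request has completed, instead of one connection per request.
def download_over_one_session(host, paths)
  http = Net::HTTP.new(host, 443)
  http.use_ssl = true
  http.start                            # open the TCP/TLS connection once
  paths.map { |p| http.get(p).code }    # each request reuses the same socket
ensure
  # Close the session exactly once, after all downloads have finished;
  # this is the role an http.finish call plays in the PR.
  http.finish if http&.started?
end
```

Because every request rides the same keep-alive connection, the server sees one long-lived client rather than a burst of fresh connections, which is what appears to trigger the rate limiting.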
