
Feature: Random delays #51

Closed
Skallwar opened this issue Apr 30, 2020 · 8 comments · Fixed by #66
Labels: enhancement (New feature or request), good first issue (Good for newcomers)

Comments

@Skallwar
Owner

This is needed to avoid IP bans.

@Skallwar added the enhancement (New feature or request) and good first issue (Good for newcomers) labels on Apr 30, 2020
@SpyrosRoum
Contributor

SpyrosRoum commented May 5, 2020

How long a delay would be a good amount, and how much variation would be enough to avoid triggering a block?
I'm relatively new to Rust, but I'd like to give this a try, so if there is anything else I should know, please share it.

Edit: Would it be enough to generate a random number from, say, 50 to 100, and add it to the base SLEEP_DURATION here?

@CohenArthur
Collaborator

CohenArthur commented May 5, 2020

@SpyrosRoum the delay you linked to only happens when the queue is empty. If that sleep happens too many times, the application will just end, since it means we ran out of links to visit. However, the queue may be empty for a little while as other threads are repopulating it, so we put the inactive threads to sleep for a bit (SLEEP_DURATION) so that they're not running for nothing.
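
To make that concrete, here is a minimal sketch of that loop; worker, handle_url, and the queue type are illustrative stand-ins, not SuckIT's exact code:

```rust
use std::{sync::mpsc::Receiver, thread, time::Duration};

const SLEEP_DURATION: Duration = Duration::from_millis(100); // illustrative value

// Workers pull URLs off a shared queue. When the queue is momentarily
// empty they park for SLEEP_DURATION instead of busy-waiting, giving
// the other threads time to repopulate it.
fn worker(queue: Receiver<String>) {
    loop {
        match queue.try_recv() {
            Ok(url) => handle_url(&url),
            Err(_) => thread::sleep(SLEEP_DURATION),
        }
    }
}

fn handle_url(url: &str) {
    // download and parse `url`, pushing discovered links back onto the queue
    let _ = url;
}
```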

We need to add another type of delay for when the queue is still populated, so that we don't exceed the website's limits. Do you want to work on this? I can assign you to it and you can ask all the questions you need here :)

Thanks a lot for giving it a try!

@SpyrosRoum
Contributor

Ohh, you're right, I didn't pay much attention at all (I just did a Ctrl+F for sleep_duration xd)

So we would want a random delay only when we successfully got something from the queue, which means we would sleep after handling the URL in the Ok arm of that same match, as in the sketch below. Right?

I'd be happy to be assigned to it, but what I have in mind feels too easy not to have been done already by one of you, so I may have the wrong idea of what I'm getting myself into here.

If it is just adding a random delay after handling the message, then sure, I can do that.
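
Something like this, reusing the illustrative names from the sketch above (random_delay is a hypothetical helper, fleshed out further down the thread):

```rust
fn worker(queue: Receiver<String>) {
    loop {
        match queue.try_recv() {
            Ok(url) => {
                handle_url(&url);
                // Proposed addition: throttle only after successful work;
                // the empty-queue sleep below stays unchanged.
                random_delay();
            }
            Err(_) => thread::sleep(SLEEP_DURATION),
        }
    }
}
```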

@CohenArthur
Collaborator

That sounds like a good idea! The reason we haven't done it yet is that there were other, more important features to implement, and we're also quite busy. It's marked as a good first issue, so it is one of the easy ones, no worries :) Some issues require a bit more understanding of other parts of the code, but this one is fairly simple. I'll assign you :)

@SpyrosRoum
Contributor

SpyrosRoum commented May 6, 2020

Amazing, I'll get right to it.
Now, back to my original question: do you have any idea of what would be a good base value and a good deviation for each sleep?

This will inevitably slow down the whole thing, so I guess we want it to be as low as possible.

Also, is using the rand lib acceptable?

Edit: Oh, also, since I'm kinda new to GitHub too, should I fork and create a new branch, and then push to your master from my feature branch?

@CohenArthur
Collaborator

I have zero idea about what a good delay would be haha. I'm sure there are some articles related to scraping that can help.

Using a lib when you can't find the required function in the standard lib is perfectly fine!

And regarding the process: you should fork, make your changes, and then open a pull request against our master :) We'll review it then. Thanks again!

@SpyrosRoum
Contributor

Alright, I have some good news and some bad news.
After looking around for a bit, some places suggested a delay of 10-20 seconds, while others said about 5 seconds should do it.

So I added a base of 2 seconds plus an extra random number from 0 to 5 seconds, so in total the delay is between 2 and 7 seconds.
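
As a sketch, that delay boils down to something like this (using rand 0.8's range syntax; on rand 0.7 the call would be gen_range(0, 6)):

```rust
use rand::Rng;
use std::{thread, time::Duration};

// Sleep for a base of 2 seconds plus a random 0-5 extra seconds,
// i.e. a total delay between 2 and 7 seconds.
fn random_delay() {
    let extra: u64 = rand::thread_rng().gen_range(0..=5);
    thread::sleep(Duration::from_secs(2 + extra));
}
```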

This means running suckit http://books.toscrape.com -j 8 (built for debug) for one minute downloaded 103 pages (which is very similar to the number you got from httrack).

Oh, and the good news is that it's working with a random delay now.

@Skallwar
Owner Author

Skallwar commented May 6, 2020

Thanks for the work, you rock :)
Some websites don't need such a feature to be scraped. The goal here is to add an option for websites which require this timing limitation.

With this option enabled, yes, SuckIT will be slower, but otherwise performance will be the same as before.
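
As a rough sketch of what that opt-in could look like, assuming a structopt-based CLI like SuckIT's (the flag name and default here are hypothetical, not necessarily what PR #66 shipped):

```rust
use structopt::StructOpt;

#[derive(StructOpt)]
struct Args {
    /// Hypothetical flag: maximum extra random delay (in seconds)
    /// between downloads; 0 keeps the old full-speed behavior.
    #[structopt(long = "delay", default_value = "0")]
    delay: u64,
}
```

With a default of 0, the fast path stays untouched unless the user opts in.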
