Missing timeout option #28
Comments
@Jordi-m What if it returned an error? Are you using .catch?
I did add the catch, but in the last three days all 40 websites have been up and running; I'm unable to reproduce this situation myself (I even added a slow-loading URL). If it becomes an issue again, I'll get back to this GitHub issue. Thank you for the suggestion so far.
@Jordi-m I will keep an eye on it too. I'm also scraping lots of pages and haven't had this problem yet...
@IonicaBizau It's been a while, but today I spotted the problem mentioned in this issue again. I immediately tried to find out which website it is. It turns out that this specific website causes an infinite(?) HTTP redirect. When visiting the page in a browser, the developer console shows HTTP 508: Loop Detected. I suspect the library will just wait "forever" for the redirects to stop?
@Jordi-m Hmm, good point. Reported here: follow-redirects/follow-redirects#41
This should be fixed now. |
Hi,
I've tried to use this library to scrape about 40 websites asynchronously. I do this by using the Promise object returned by ScrapeIt, and then doing something like this:
Promise.all(promises).then(function (result) {
    // ...
});
The problem here is that one of the websites I scrape can be down/slow at unspecified times (I have no control over it). The problem with the library is that ScrapeIt never seems to time out (I tried it for a few minutes, but it won't return and run the Promise.all ... code).
Any suggestion on how I can make it time out (while still using the library's promises)? Did I miss an option?
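One way to get a timeout without any library support is to race each scrape promise against a timer. The helper below is a hypothetical sketch (the name withTimeout and the 5000 ms value are not part of scrape-it); it works with any promise-returning call:

```javascript
// Hypothetical helper: reject if `promise` takes longer than `ms`.
function withTimeout(promise, ms, label = "operation") {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(`${label} timed out after ${ms} ms`)),
      ms
    );
  });
  // Whichever settles first wins; always clear the timer afterwards.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Usage sketch: wrap each scrape so one slow site cannot stall
// the whole Promise.all. `scrapeIt`, `urls`, and `schema` are
// assumed from the setup described above.
// const promises = urls.map((url) =>
//   withTimeout(scrapeIt(url, schema), 5000, url).catch((err) => ({ err }))
// );
// Promise.all(promises).then(function (result) { /* ... */ });
```

Catching the timeout error per promise (as in the usage sketch) keeps Promise.all from rejecting outright, so the results from the healthy sites still come through.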