
scrapyd: job timeout #156

Closed
alexw23 opened this Issue Jul 9, 2012 · 1 comment

2 participants

@alexw23
alexw23 commented Jul 9, 2012

Just a small issue I've been having with scrapyd. Four of my jobs encountered MySQL errors and then ground to a halt, and I'm wondering how to handle this. The jobs have been running for over two days and are bottlenecking the queue.

Is it possible to set a timeout variable for jobs? Alternatively, how would I exit the crawler on a MySQL error (from the pipeline)?
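
For reference, here is a minimal sketch of one way a pipeline might shut the spider down on a database error, assuming a Scrapy version where pipelines can get a crawler reference via `from_crawler`. The `MySQLPipeline` class and its `store()` helper are hypothetical placeholders, not part of scrapyd or this project:

```python
import MySQLdb  # any DB-API driver's error classes would work the same way


class MySQLPipeline(object):
    """Example pipeline that closes the spider when the database errors out."""

    @classmethod
    def from_crawler(cls, crawler):
        # keep a reference to the crawler so the pipeline can reach the engine
        pipeline = cls()
        pipeline.crawler = crawler
        return pipeline

    def process_item(self, item, spider):
        try:
            self.store(item)  # hypothetical helper doing the actual INSERT
        except MySQLdb.OperationalError:
            # stop the crawl instead of letting it hang; scrapyd will then
            # see the job finish and move on to the next one in the queue
            self.crawler.engine.close_spider(spider, reason='mysql_error')
            raise
        return item

    def store(self, item):
        raise NotImplementedError  # placeholder for the real DB write
```

For the timeout part of the question, Scrapy's CloseSpider extension has a `CLOSESPIDER_TIMEOUT` setting that stops a spider after a fixed number of seconds, which limits how long any single job can run even though scrapyd itself has no per-job timeout.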

@dangra
Scrapy project member
dangra commented Jan 29, 2013

Please route questions like this to the Scrapy mailing list: http://groups.google.com/group/scrapy-users/

@dangra dangra closed this Jan 29, 2013