
Crawlers won't shutdown gracefully on SIGINT #450

Closed
demji opened this issue Nov 1, 2013 · 4 comments

demji commented Nov 1, 2013

Hello,

Crawlers won't shut down gracefully because CrawlerProcess's _start_crawler method pops crawlers off the self.crawlers list, which is where the stop method looks for crawlers to stop.

Steps to reproduce:

  1. scrapy crawl
  2. Send SIGINT via Ctrl-c
  3. Crawler continues running after displaying "2013-11-01 09:24:49-0400 [scrapy] INFO: Received SIGINT, shutting down gracefully. Send again to force"
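The failure mode described above can be sketched in a few lines. This is a minimal, hypothetical model of the reported interaction (simplified stand-in classes, not the actual Scrapy source): if starting a crawler removes it from the shared list, a later stop() iterates over an empty list and never reaches the running crawler.

```python
# Hypothetical sketch of the reported bug, not actual Scrapy code.

class Crawler:
    """Stand-in for a Scrapy crawler with start/stop state."""

    def __init__(self, name):
        self.name = name
        self.running = False

    def start(self):
        self.running = True

    def stop(self):
        self.running = False


class CrawlerProcess:
    """Stand-in process that tracks crawlers in self.crawlers."""

    def __init__(self):
        self.crawlers = []

    def create_crawler(self, name):
        crawler = Crawler(name)
        self.crawlers.append(crawler)
        return crawler

    def _start_crawler(self):
        # Buggy step: pop() removes the crawler from self.crawlers,
        # so stop() below never sees the running crawler.
        crawler = self.crawlers.pop(0)
        crawler.start()
        return crawler

    def stop(self):
        # The SIGINT handler looks for crawlers here, but the running
        # one was already popped off in _start_crawler.
        for crawler in self.crawlers:
            crawler.stop()


process = CrawlerProcess()
crawler = process.create_crawler("example")
process._start_crawler()
process.stop()               # what "shutting down gracefully" would invoke
print(crawler.running)       # still True: the crawler keeps running
```

Under this model, a fix would be to keep started crawlers tracked (e.g. in a separate collection, or by not popping until they finish) so that stop() can still reach them.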

$ scrapy version -v
Scrapy : 0.18.4
lxml : 3.2.3.0
libxml2 : 2.9.1
Twisted : 13.1.0
Python : 2.7.5 (default, Aug 17 2013, 13:35:16) - [GCC 4.6.3]
Platform: Linux-3.10.7-gentoo-r1-x86_64-Intel-R-Core-TM-2_Quad_CPU_Q9550@_2.83GHz-with-gentoo-2.2

Regards,
demji

dangra (Member) commented Nov 5, 2013

@demji thanks!


demji commented Nov 5, 2013

@dangra Thank you! :)

@nickeldan

I'm getting this exact issue on version 2.11.2.

@Gallaecio (Member)

Then please open a separate issue with a minimal, reproducible example.
