Article: How to use Scrapy with IronWorker #93

Open
treeder opened this Issue May 14, 2012 · 5 comments

Contributor

treeder commented May 14, 2012

No description provided.

ghost assigned paddycarver May 14, 2012

turian commented Sep 24, 2012

I would love to see this.

turian commented Sep 24, 2012

Or any Python distributed crawler.

Contributor

paddycarver commented Sep 24, 2012

I'll bump it up in priority. :) Watch this thread--when I clear off some of the more pressing stuff I'm doing, I'll work on this.

Contributor

paddycarver commented Oct 15, 2012

@turian Can I ask where you're running into difficulty? What would you like to see in an article? I'm trying to figure out how to approach this in a way that won't end with us writing an example for every web scraper ever, because that isn't really tenable.

Installing scrapy and lxml should do for now. When I test my workers locally they work, but when I upload them the build fails with errors:

No package 'libffi' found
removing: _configtest.c _configtest.o
c/_cffi_backend.c:13:17: fatal error: ffi.h: No such file or directory
compilation terminated.
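
For what it's worth, the ffi.h error in that log is cffi failing to compile because the libffi development headers aren't available in the environment doing the build, so it looks like a build-environment issue rather than a problem with the worker script itself. To make the discussion concrete, here is a minimal sketch of the kind of self-contained Scrapy worker being described; the file name, target site, and CSS selectors are placeholders made up for illustration, and it uses the plain Scrapy CrawlerProcess API rather than anything IronWorker-specific.

    # scrapy_worker.py: hypothetical entry point for an IronWorker Python task.
    # The spider name, URL, and selectors below are placeholders, not from this issue.
    import scrapy
    from scrapy.crawler import CrawlerProcess

    class QuotesSpider(scrapy.Spider):
        name = "quotes"
        start_urls = ["http://quotes.toscrape.com/"]  # placeholder crawl target

        def parse(self, response):
            # Yield one item per quote block found on the page.
            for quote in response.css("div.quote"):
                yield {
                    "text": quote.css("span.text::text").extract_first(),
                    "author": quote.css("small.author::text").extract_first(),
                }

    if __name__ == "__main__":
        # Run the spider in-process; a worker would simply execute this script
        # as its payload and let Scrapy handle the crawl.
        process = CrawlerProcess(settings={"LOG_LEVEL": "INFO"})
        process.crawl(QuotesSpider)
        process.start()  # blocks until the crawl finishes

Driving the spider through CrawlerProcess keeps the whole worker in one script, which is easier to package and upload than a full Scrapy project; getting lxml (and anything that pulls in cffi) to compile during the remote build is the part an article would need to cover.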